I want to generate an XML file from a single object (containing a nested collection) with a large amount of data, but there is a limitation: each XML file can't exceed 50MB.

Is there a good way to do this?

Update: speed is not important; the main thing is to split the output into files of at most 50MB each.

A: 

So what do you want to do when it does? How will you determine what to leave out?

Noon Silk
I will split it into multiple XML files of at most 50MB each.
pang
How would you perform the split? Potentially you could use an archive tool to do the splitting. Basically, I'm suggesting you'll need to implement your own XML-serialisation code and decide how to break the data up, depending on how much you write. But if you can 'trick' your way out of that by getting WinRAR or a similar tool to split the file for you, then maybe that is an option.
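A minimal sketch of that do-it-yourself split, assuming a 50MB threshold and a made-up file-name pattern and `<Item>` element (none of these are from the original posts): close the current XmlWriter and start a new file whenever the underlying stream passes the size limit.

```csharp
using System.IO;
using System.Xml;

class XmlSplitter
{
    const long MaxBytes = 50L * 1024 * 1024; // 50MB per file (assumed limit)

    // Writes items as <Item> elements under a <Root>, rolling over
    // to a new numbered file whenever the current one hits the limit.
    public static void Write(string baseName, string[] items)
    {
        int part = 0;
        FileStream stream = null;
        XmlWriter writer = null;

        foreach (string item in items)
        {
            if (writer == null)
            {
                stream = File.Create(baseName + "." + part + ".xml");
                writer = XmlWriter.Create(stream);
                writer.WriteStartElement("Root");
                part++;
            }

            writer.WriteElementString("Item", item);
            writer.Flush(); // push buffered bytes so stream.Length is accurate

            if (stream.Length >= MaxBytes)
            {
                // Finish this file and force a new one on the next item.
                writer.WriteEndElement();
                writer.Close();
                stream.Close();
                writer = null;
            }
        }

        if (writer != null)
        {
            writer.WriteEndElement();
            writer.Close();
            stream.Close();
        }
    }
}
```

Note that a split mid-document means each part needs its own root element, as above, so every file remains well-formed XML on its own.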
Noon Silk
+1  A: 

Have you considered writing the XML file as a string instead of using the XML support in .NET?

I had a problem like this: I was writing ~10GB of data to XML, as that was the only way a tool could consume it.

My XML was so simple that I just used a TextWriter and nested for loops to write it.

It worked a charm, and it was a lot faster than the XML object model.
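A sketch of that string-based approach, assuming made-up element names and data shape: a plain StreamWriter, nested loops, and `SecurityElement.Escape` for the one thing you must not forget when writing XML by hand, escaping the content.

```csharp
using System.IO;
using System.Security; // SecurityElement.Escape handles &, <, >, ", '

class RawXmlWriter
{
    // Writes each inner array as a <Group> of <Item> elements,
    // building the XML as plain text rather than via XmlWriter.
    public static void Write(string path, string[][] groups)
    {
        using (StreamWriter w = new StreamWriter(path))
        {
            w.WriteLine("<?xml version=\"1.0\"?>");
            w.WriteLine("<Groups>");
            foreach (string[] group in groups)     // outer loop: one <Group> per array
            {
                w.WriteLine("  <Group>");
                foreach (string item in group)     // inner loop: one <Item> per value
                    w.WriteLine("    <Item>" + SecurityElement.Escape(item) + "</Item>");
                w.WriteLine("  </Group>");
            }
            w.WriteLine("</Groups>");
        }
    }
}
```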

ben
Everything is faster than the xml object ;)
Chris Lively
+2  A: 

You can write a big XML file with XmlWriter or XDocument without any problem.

Here is an example using the XmlWriter class; it generates a 63MB XML file in less than five seconds.

using (XmlWriter writer = XmlWriter.Create("YourFilePath"))
{
    writer.WriteStartDocument();

    writer.WriteStartElement("Root");

    for (int i = 0; i < 1000000; i++) //Write one million nodes.
    {
        writer.WriteStartElement("Node"); // child element, distinct from the root
        writer.WriteAttributeString("value", "Value #" + i.ToString());
        writer.WriteString("Inner Text #" + i.ToString());
        writer.WriteEndElement();
    }
    writer.WriteEndElement();

    writer.WriteEndDocument();
}
Francis B.
I have written and read XML files of multiple gigabytes using this method; it works fine. For extra credit, you can hook it up through a GZipStream to compress the file as well...
Ben Childs
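The compression idea in that last comment can be sketched like this (file name and element names are assumptions): wrap the FileStream in a GZipStream and hand that to XmlWriter, so the XML is compressed as it is written.

```csharp
using System.IO;
using System.IO.Compression;
using System.Xml;

class CompressedXml
{
    // Streams XML through a GZipStream so it is compressed on the fly,
    // never holding the full document in memory.
    public static void Write(string path)
    {
        using (FileStream file = File.Create(path))
        using (GZipStream gzip = new GZipStream(file, CompressionMode.Compress))
        using (XmlWriter writer = XmlWriter.Create(gzip))
        {
            writer.WriteStartElement("Root");
            for (int i = 0; i < 1000; i++)
                writer.WriteElementString("Item", "Value #" + i);
            writer.WriteEndElement();
        }
    }
}
```

To read it back, open the file with a GZipStream in `CompressionMode.Decompress` and pass that stream to XmlReader.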