Dear friends,

I'm supposed to do the following:
1) read a huge (700MB ~ 10 million elements) XML file;
2) parse it preserving order;
3) create a text(one or more) file with SQL insert statements to bulk load it on the DB;
4) read the relational tuples back from the database and write them out as XML.

I'm here to exchange some ideas about the best (== fast fast fast...) way to do this. I will use C# 4.0 and SQL Server 2008.

I believe that XmlTextReader is a good start, but I do not know if it can handle such a huge file. Does it load the whole file when it is instantiated, or does it hold only the node currently being read in memory? I suppose I can do a while (reader.Read()) loop and that should be fine.
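
Something like this streaming loop is what I have in mind (file and element names are just placeholders):

    using System;
    using System.Xml;

    class XmlStreamReadDemo
    {
        static void Main()
        {
            // XmlReader.Create streams node by node, so only the current node
            // is ever held in memory -- a 700 MB file should be no problem.
            using (XmlReader reader = XmlReader.Create("huge.xml"))  // path is illustrative
            {
                long elements = 0;
                while (reader.Read())
                {
                    if (reader.NodeType == XmlNodeType.Element)
                    {
                        elements++;
                        // reader.Name, reader.HasAttributes and reader.MoveToNextAttribute()
                        // expose the current element without loading the rest of the file.
                    }
                }
                Console.WriteLine("Elements seen: " + elements);
            }
        }
    }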

What is the best way to write the text files? Since I need to preserve the ordering of the XML (adopting some numbering scheme), I will have to hold some parts of the tree in memory to do the calculations, etc. Should I build the output with a StringBuilder?
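
For example, something like this, streaming the INSERT statements out with a StreamWriter instead of accumulating one giant StringBuilder (table and column names are invented):

    using System.IO;

    class SqlScriptWriterDemo
    {
        static void Main()
        {
            // Write each INSERT as soon as it is generated instead of holding
            // 10 million of them in memory. Table and column names are illustrative.
            using (var writer = new StreamWriter("inserts_001.sql"))
            {
                for (int ordinal = 0; ordinal < 1000; ordinal++)   // stand-in for the real parsing loop
                {
                    writer.WriteLine(
                        "INSERT INTO Node (OrdinalKey, Name) VALUES ({0}, 'elem{0}');",
                        ordinal);
                }
            }
        }
    }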

I will have two scenarios: one where every node (element, attribute or text) goes into the same table (i.e., is the same kind of object), and another where each type of node (just these three types, no comments etc.) gets its own table in the DB and a class to represent that entity.
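
To make the first scenario concrete, I am thinking of an entity roughly like this (field names are purely illustrative):

    // Illustrative entity for the "all nodes in one table" scenario.
    // The parent/ordinal pair is one way to preserve document order.
    enum XmlNodeKind { Element, Attribute, Text }

    class NodeRow
    {
        public long Id;            // surrogate key
        public long? ParentId;     // null for the document root
        public int Ordinal;        // position among siblings, preserves XML order
        public XmlNodeKind Kind;
        public string Name;        // element/attribute name, null for text nodes
        public string Value;       // attribute value or text content
    }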

My last specific question is how well DataSet.WriteXml performs. Will it handle 10 million tuples? Maybe it is best to bring chunks from the database and use an XmlWriter... I really don't know.
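
Something like this is what I mean by bringing chunks and using an XmlWriter (connection string, query and element names are placeholders):

    using System.Data.SqlClient;
    using System.Xml;

    class XmlExportDemo
    {
        static void Main()
        {
            // Stream rows from SQL Server straight into an XmlWriter, so neither
            // the whole result set nor the whole document sits in memory at once.
            using (var connection = new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"))
            using (var command = new SqlCommand("SELECT Id, Name, Value FROM Node ORDER BY Id", connection))
            {
                connection.Open();
                using (SqlDataReader reader = command.ExecuteReader())
                using (XmlWriter writer = XmlWriter.Create("export.xml"))
                {
                    writer.WriteStartElement("nodes");
                    while (reader.Read())
                    {
                        writer.WriteStartElement("node");
                        writer.WriteAttributeString("id", reader.GetInt64(0).ToString());
                        writer.WriteAttributeString("name", reader.GetString(1));
                        writer.WriteString(reader.IsDBNull(2) ? "" : reader.GetString(2));
                        writer.WriteEndElement();
                    }
                    writer.WriteEndElement();
                }
            }
        }
    }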

I'm testing all this stuff, but I decided to post this question to hear from you guys, hoping your expertise can help me do these things more correctly and faster.

Thanks in advance,

Pedro Dusso

+3  A: 

I'd use the SQLXML Bulk Load Component for this. You provide a specially annotated XSD schema for your XML with embedded mappings to your relational model. It can then bulk load the XML data blazingly fast.

If your XML has no schema, you can create one from Visual Studio by loading the file and selecting Create Schema from the XML menu. You will need to add the mappings to your relational model yourself, however. This blog has some posts on how to do that.
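
Roughly, the component is driven like this (a sketch that assumes a COM reference to the SQLXML 4.0 Bulkload type library; the exact interop class name depends on how the reference is generated, and all paths and names are placeholders):

    class BulkLoadDemo
    {
        static void Main()
        {
            // Assumes a COM reference to "Microsoft SQLXML Bulkload 4.0 Type Library"
            // (namespace SQLXMLBULKLOADLib). The annotated XSD (mapping.xsd) carries the
            // sql:relation / sql:field mappings to the relational tables.
            var bulkLoad = new SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class();
            bulkLoad.ConnectionString =
                "Provider=SQLOLEDB;Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI";
            bulkLoad.ErrorLogFile = @"C:\load\errors.xml";   // rejected rows are reported here
            bulkLoad.Execute(@"C:\load\mapping.xsd", @"C:\load\huge.xml");
        }
    }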

mancaus
Can I create this XSD programmatically? I will receive an unknown XML file, with no schema attached.
Pmdusso
I studied the SQLXML bulk load. It is for a very specific scenario, where you already have a very well constructed XSD. It has many guidelines and limitations. It will be too difficult to generate a good XSD schema and load with it when I don't know in advance which file will come :(
Pmdusso
+1  A: 

Guess what? You don't have a SQL Server problem. You have an XML problem!

Faced with your situation, I wouldn't hesitate. I'd use Perl and one of its many XML modules to parse the data, create simple tab- or other-delimited files to bulk load, and bcp the resulting files.
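
Whatever language does the parsing, the shape of the pipeline is the same; a rough sketch in the C# you are already using (file, table and element names are illustrative):

    using System.IO;
    using System.Xml;

    class DelimitedExportDemo
    {
        static void Main()
        {
            // Stream the XML once and emit a tab-delimited file, then bulk load it with
            // bcp from the command line, e.g.:  bcp MyDb.dbo.Node in nodes.tsv -c -S myserver -T
            // (-c character mode defaults to tab-delimited fields). All names are illustrative.
            using (XmlReader reader = XmlReader.Create("huge.xml"))
            using (var output = new StreamWriter("nodes.tsv"))
            {
                long ordinal = 0;
                while (reader.Read())
                {
                    if (reader.NodeType == XmlNodeType.Element)
                    {
                        output.WriteLine("{0}\t{1}", ordinal++, reader.Name);
                    }
                }
            }
        }
    }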

Using the server to parse your XML has many disadvantages:

  1. Not fast, more than likely
  2. Positively useless error messages, in my experience
  3. No debugger
  4. Nowhere to turn when one of the above turns out to be true

If you use Perl, on the other hand, you have line-by-line processing and debugging, error messages intended to guide a programmer, and many alternatives should your first choice of package turn out not to do the job.

If you do this kind of work often and don't know Perl, learn it. It will repay you many times over.

James K. Lowden