views: 73
answers: 2

XML is one of our main integration points. Data comes in from many clients at a time, but too many clients importing simultaneously can slow our database to a crawl. Someone has to have solved a problem like this.

I am basically using VB to parse through the data and import what I want, discarding what I don't.

Is there a better way?

A: 

Without specifics it's hard to say where your slow spot is. Are you measuring where the time is being spent in your application?

I find that, many times, a large data set is materialized unnecessarily, which eats memory and can kill performance. This could happen in your scenario if you receive the XML input data and load it all into an XmlDocument before you start parsing. This will kill you if the XmlDocument is large.
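
To illustrate the pattern (the file path and element name here are made up), this is the kind of code that materializes everything up front:

    Imports System.Xml

    Module WholeDocumentImport
        Sub Main()
            ' The entire document is built in memory before any
            ' processing begins; painful when the input is large.
            Dim doc As New XmlDocument()
            doc.Load("C:\feeds\orders.xml")
            For Each node As XmlNode In doc.SelectNodes("//Order")
                ' import the fields you care about from node...
            Next
        End Sub
    End Module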

If possible, aim to process the data incrementally by reading it with an XmlReader. Some datasets are amenable to this approach: the processing required doesn't need a lot of context, but proceeds linearly across the data. In this case you'll see a huge improvement over sucking everything down into an XmlDocument. However, other datasets are so structured that you really do have to have everything in core before continuing.
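
A minimal sketch of the streaming approach, assuming the same hypothetical file and an element named Order:

    Imports System.Xml

    Module StreamingImport
        Sub Main()
            ' Streams through the document one node at a time; only the
            ' current element's data is held in memory.
            Using reader As XmlReader = XmlReader.Create("C:\feeds\orders.xml")
                While reader.Read()
                    If reader.NodeType = XmlNodeType.Element AndAlso reader.Name = "Order" Then
                        Dim id As String = reader.GetAttribute("Id")
                        ' insert just the fields you want into the database here...
                    End If
                End While
            End Using
        End Sub
    End Module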

Again, it's hard to say if there is a better way without understanding how the input data is structured.

John Källén
This is interesting; I didn't know that about the XmlDocument object...
Rico
+1  A: 

Have you considered creating an SSIS package? You can efficiently import data from many different source types this way.

Here's a good starting point: http://msdn.microsoft.com/en-us/library/ms188032(v=SQL.100).aspx
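
If the package takes the file path as a variable, you could also kick it off from VB for each incoming file. This is only a sketch; the package path and the variable name (XmlFilePath) are assumptions:

    Imports Microsoft.SqlServer.Dts.Runtime

    Module RunImportPackage
        Sub Main()
            Dim app As New Application()
            ' Load the package from disk; it could also be stored in MSDB.
            Dim pkg As Package = app.LoadPackage("C:\packages\ImportClientXml.dtsx", Nothing)
            ' Point the package at one client's file, then execute it.
            pkg.Variables("XmlFilePath").Value = "C:\feeds\client1.xml"
            Dim result As DTSExecResult = pkg.Execute()
            Console.WriteLine("Package result: " & result.ToString())
        End Sub
    End Module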

BradBrening
Does an SSIS package work for a multiple-client scenario? There could be 100 different locations for the XML files...
Rico
If the files have the same schema and they are on an accessible drive, it shouldn't be an issue.
BradBrening