views:

157

answers:

2

We are working on an ASP.NET project, and there are three ways one can update data in the database when multiple rows need to be updated or inserted.

Let's assume we need to update employee education details (which could be 1, 3, 5, or 10 records).

Methods to Update Data

  1. Pass values as parameters (the traditional approach); if there are 10 records, 10 round trips are required.

  2. Pass the data as XML and write logic inside your stored procedure to extract it from the XML and update the table (only a single round trip required).

  3. Use table-valued parameters (only a single round trip required).

Note: the data is available as a List, so I need to convert it to XML (or some other format) before I can pass it.
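For illustration, here is a rough sketch of what method 3 could look like on the SQL Server side. All object and column names below are invented for the example:

```sql
-- Hypothetical table type describing one education record (names are assumptions):
CREATE TYPE dbo.EmployeeEducationType AS TABLE
(
    EmployeeId   INT           NOT NULL,
    Degree       NVARCHAR(100) NOT NULL,
    YearObtained INT           NULL
);
GO

-- A procedure can then receive the whole batch in one round trip.
-- TVP parameters must be declared READONLY.
CREATE PROCEDURE dbo.UpdateEmployeeEducation
    @Education dbo.EmployeeEducationType READONLY
AS
BEGIN
    UPDATE e
    SET    e.Degree       = t.Degree,
           e.YearObtained = t.YearObtained
    FROM   dbo.EmployeeEducation e
    JOIN   @Education t ON t.EmployeeId = e.EmployeeId;
END
```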

There are a number of places in the application where we need to update data in bulk (multiple records at once).

I just need your suggestions on the following:

  1. Which method will be faster (please mention any other overheads)?

  2. Are there manageability or testability concerns with any of the approaches?

  3. Are there any other bottlenecks or issues with any of the approaches (serialization/deserialization concerns, or limits on the size of the data passed)?

  4. Can you suggest any other method for the same operation?

Thanks

+1  A: 

The table-valued parameter approach will most likely be the best one, since you can update a whole batch of rows at once; after all, you get a table that you can easily join against.

The other approaches are either row-by-row, which is inherently slower, or require a fair bit of mucking about on the SQL Server side; that is usually not much fun, and it is also more error-prone and typically less performant than simply joining two tables.

This is exactly the scenario TVPs were introduced for: solving the "row-by-row" and "messing-around-with-XML" problems. Microsoft had a good reason to introduce them, so you should definitely give them a serious try and see if they work for you.

But again: that's just a gut feeling without really knowing all your details. Only you can find this out for yourself, by testing all three options. There are plenty of other effects and parameters that could come into play which no one answering here can possibly know.
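To make the TVP option concrete, here is a minimal client-side sketch using ADO.NET. It assumes a table type `dbo.EmployeeEducationType` and a procedure `dbo.UpdateEmployeeEducation` exist on the server; those names, the columns, and `educationList` are all placeholders for the example:

```csharp
using System.Data;
using System.Data.SqlClient;

// Copy the in-memory List into a DataTable whose columns match the table type.
var table = new DataTable();
table.Columns.Add("EmployeeId", typeof(int));
table.Columns.Add("Degree", typeof(string));
table.Columns.Add("YearObtained", typeof(int));
foreach (var ed in educationList)          // your List<T> of detail rows
    table.Rows.Add(ed.EmployeeId, ed.Degree, ed.YearObtained);

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.UpdateEmployeeEducation", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.AddWithValue("@Education", table);
    p.SqlDbType = SqlDbType.Structured;    // marks the parameter as a TVP
    p.TypeName = "dbo.EmployeeEducationType";
    conn.Open();
    cmd.ExecuteNonQuery();                 // one round trip for the whole batch
}
```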

marc_s
We have a normal order-management solution with one order master plus order detail (products), address detail, etc., and the same data then moves on to billing. Can you suggest any possible issues with table-valued parameters that we need to consider before going ahead (or any side effects of that approach)?
Harryboy
A: 

If you use TVPs you can use the MERGE statement to manage insert/update/delete (a cool new feature of SQL Server 2008).
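A hedged sketch of what that MERGE could look like inside a stored procedure that receives a TVP; the table, type, and column names are invented for the example:

```sql
-- @Education is a hypothetical READONLY table-valued parameter.
-- MERGE compares the incoming batch against the target table and
-- applies insert/update/delete in a single statement.
MERGE dbo.EmployeeEducation AS target
USING @Education AS source
    ON target.EmployeeId = source.EmployeeId
WHEN MATCHED THEN
    UPDATE SET target.Degree = source.Degree
WHEN NOT MATCHED BY TARGET THEN
    INSERT (EmployeeId, Degree)
    VALUES (source.EmployeeId, source.Degree)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;    -- MERGE must be terminated with a semicolon
```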

The only limitation of TVPs that I've seen is the 1,000-row limit, whereas that isn't an issue with XML or the row-by-row approach.

I implemented both row-by-row and TVP solutions for a data import I was doing (thousands of rows). TVPs won hands down (minutes down to seconds). So I'd suggest doing both; it will also give you metrics for telling your boss how cool you are for making stuff run faster!

Things you need to consider: what is the processing overhead of converting your data from row-by-row form to a TVP or to XML? This depends on your data (it will involve extra loops or serialization).
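To get a feel for that serialization cost in the XML case, here is a tiny self-contained sketch using LINQ to XML; the record shape and element names are invented for the example:

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

// Invented sample data standing in for the real List<T> of detail rows.
var educationList = new[]
{
    new { EmployeeId = 1, Degree = "BSc" },
    new { EmployeeId = 1, Degree = "MSc" },
};

// Build the XML payload that would be passed to the stored procedure
// as an XML or NVARCHAR(MAX) parameter (method 2). This loop is the
// per-call conversion overhead being discussed.
var xml = new XElement("Educations",
    educationList.Select(ed =>
        new XElement("Education",
            new XAttribute("EmployeeId", ed.EmployeeId),
            new XAttribute("Degree", ed.Degree))));

Console.WriteLine(xml.ToString());
```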

Check parallelism: I've seen cases where queries using table-valued parameters would only run on one core! So the speed increase you see in testing might not be reflected in production.

MarkPm