So I have an existing ASP.NET solution that uses LINQ-to-SQL to insert data into SQL Server (5 tables, 110k records total). I had read in the past that XML could be passed as a parameter to SQL Server, but my Google searches only turn up results that store the XML directly into a table. I would rather take that XML parameter and insert its nodes as individual records.

Is this possible? How is it done (i.e., how is the XML parameter used to insert records in T-SQL, and how should the XML be formatted)?

Note: I'm researching other options like SqlBulkCopy, and I know that SSIS would be a good alternative. I just want to know whether this XML approach is feasible.

+3  A: 

The XML should be formatted as a normal XML document would be. Then you just pass it to the stored procedure using a parameter of type XML.
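
For instance, here's a minimal sketch of a procedure that accepts an XML parameter and how you'd call it (the procedure name is made up; the parameter name matches the example below):

  -- hypothetical procedure; @p_AdditionalContactInfo is the XML parameter
  CREATE PROCEDURE dbo.usp_ImportAdditionalContactInfo
      @p_AdditionalContactInfo XML
  AS
  BEGIN
      SET NOCOUNT ON;
      -- shred @p_AdditionalContactInfo and insert the rows here
      -- (see the OPENXML example below)
  END
  GO

  -- calling it, with the XML document passed as the parameter value
  EXEC dbo.usp_ImportAdditionalContactInfo
      @p_AdditionalContactInfo = N'<entities><entity><firstName>Joey</firstName></entity></entities>';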

Here's an example of how to do an insert. In this case, @p_AdditionalContactInfo is the XML parameter, and it's in this form:

<entities>
  <entity>
    <firstName>Joey</firstName>
    ...        
  </entity>
  .. more entity records
</entities>

Here's the T-SQL example:

  -- handle to the parsed XML document
  DECLARE @l_index int

  -- parse the records from the XML (@p_AdditionalContactInfo is the procedure's
  -- XML parameter; @l_AdditionalContactInfoTbl is a table variable declared
  -- elsewhere in the procedure)
  EXECUTE sp_xml_preparedocument @l_index OUTPUT, @p_AdditionalContactInfo
  INSERT INTO @l_AdditionalContactInfoTbl
            ( ContactInfoID
            , FirstName
            , LastName 
            , ContactTypeID
            , Title
            , Email
            , AddressLine1
            , AddressLine2
            , City
            , State
            , Zip
            , MobilePhone
            , BusinessPhone
            , UpdateDateTime )
       SELECT ContactInfoID
            , FirstName
            , LastName
            , ContactTypeID
            , Title
            , Email
            , AddressLine1
            , AddressLine2
            , City
            , State
            , Zip
            , MobilePhone
            , BusinessPhone
            , UpdateDateTime
         FROM OPENXML (@l_index, 'entities/entity', 1)
              WITH (  ContactInfoID  int          'id'
                    , FirstName      varchar(50)  'firstName'
                    , LastName       varchar(50)  'lastName'
                    , ContactTypeID  int          'contactTypeId'
                    , Title          varchar(20)  'title'
                    , Email          varchar(100) 'email'
                    , AddressLine1   varchar(100) 'addressLine1'
                    , AddressLine2   varchar(100) 'addressLine2'
                    , City           varchar(50)  'city'
                    , State          varchar(2)   'state'
                    , Zip            varchar(5)   'zip'
                    , MobilePhone    varchar(12)  'mobilePhone'
                    , BusinessPhone  varchar(12)  'businessPhone'
                    , UpdateDateTime datetime     'updateDateTime'
                   )
  -- release the memory used by the parsed document
  EXECUTE sp_xml_removedocument @l_index
dcp
Thank you very much for your suggestion. Once I overcame some conversion hurdles, I was able to take a 15-minute upload with 110k individual inserts down to a 3-minute upload with 20-30 batched inserts (I had to send the XML whenever it hit a certain size, due to out-of-memory exceptions).
Mayo
@Mayo - You're quite welcome, I'm glad to hear the answer helped you solve your problem :)
dcp
+2  A: 

For SQL Server 2008, there's really no need anymore for legacy methods like OPENXML - that's quite a beast of an interface and not very intuitive to use.

Instead, use native SQL Server XQuery to "shred" your XML document into pieces and store those:

-- @XmlInput is an XML-typed parameter (or variable) holding the document
INSERT INTO 
   dbo.YourTableName(.... list of fields.......)
   SELECT
       nodes.entity.value('(firstName)[1]', 'varchar(50)'),
       nodes.entity.value('(lastName)[1]', 'varchar(50)'),
        ........ (more columns as needed) ...........
   FROM
       @XmlInput.nodes('/entities/entity') AS nodes(entity)
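
For reference, here's a self-contained sketch of the same pattern with just two of the columns from the XML above (the table variable is only for illustration):

  DECLARE @XmlInput XML = N'<entities>
      <entity><firstName>Joey</firstName><lastName>Smith</lastName></entity>
      <entity><firstName>Jane</firstName><lastName>Doe</lastName></entity>
  </entities>';

  -- illustrative target; in practice this would be your real table
  DECLARE @Contacts TABLE (FirstName varchar(50), LastName varchar(50));

  INSERT INTO @Contacts (FirstName, LastName)
  SELECT
      nodes.entity.value('(firstName)[1]', 'varchar(50)'),
      nodes.entity.value('(lastName)[1]', 'varchar(50)')
  FROM
      @XmlInput.nodes('/entities/entity') AS nodes(entity);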
marc_s
I've found that, performance-wise, OPENXML can be far faster than using .nodes and many .value method calls. It may be old but it still has its place.
Daniel Renshaw
I found that this works, but performance is miserable relative to using OPENXML for large sets of data. That said, XQuery is more readable and would probably be appropriate in situations where performance is not an issue. Thanks for your input.
Mayo
+2  A: 

Yes, you can make use of SQL Server's built-in XML support. You actually don't need to use sp_xml_preparedocument etc.

I wrote up an article on the various approaches you could take to pass a set of records into a sproc to be persisted to the database (CSV vs. XML vs. table-valued parameters) - have a look at that, as I've given examples of each approach and a performance comparison of them. In short, I'd recommend checking out table-valued parameters - they're a much more natural way of doing this and gave the best performance.
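
For illustration, a minimal sketch of the table-valued parameter approach on SQL Server 2008 (the type, procedure, and table names here are made up):

  -- table type describing the shape of the records being passed in
  CREATE TYPE dbo.ContactTableType AS TABLE
  (
      FirstName varchar(50),
      LastName  varchar(50)
  );
  GO

  CREATE PROCEDURE dbo.usp_ImportContacts
      @Contacts dbo.ContactTableType READONLY
  AS
  BEGIN
      SET NOCOUNT ON;
      -- insert straight from the table-valued parameter into the target table
      INSERT INTO dbo.Contact (FirstName, LastName)
      SELECT FirstName, LastName
      FROM @Contacts;
  END
  GO

From ADO.NET you pass a DataTable (or an IEnumerable<SqlDataRecord>) as the parameter value with SqlDbType.Structured.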

AdaTheDev