I have a stored procedure which accepts

@MissingRecordsXML NTEXT

It contains XML records. Another developer used a cursor to fetch rows from OPENXML, applied all the business rules to each row, and then inserted each row into a table.

I want to know the best way to iterate over XML data in SQL Server. I have to improve the performance of this stored procedure, as it is really slow when there are many records. The cursor is already declared as read-only. Please help.

The code that fills the cursor from the XML is:

EXEC sp_xml_preparedocument @hDocSEC OUTPUT, @MissingRecordsXML

DECLARE SEC_UPDATE CURSOR FOR
SELECT MissingColumn, TableName, PhysicalColName, Grantor_Grantee
FROM OPENXML(@hDocSEC, 'MissingDS/MissingTable', 2)
     WITH (MissingColumn   VARCHAR(1000),
           TableName       VARCHAR(100),
           PhysicalColName VARCHAR(100),
           Grantor_Grantee VARCHAR(100))

OPEN SEC_UPDATE

FETCH NEXT FROM SEC_UPDATE
INTO @MissingColumn, @TableName, @ActualColumnName, @Grantor_Grantee
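
The rest of the cursor follows the usual fetch-loop pattern, roughly like this (a sketch only; the real business-rule logic and the target table are not shown in the question, so dbo.SomeTargetTable is a placeholder):

WHILE @@FETCH_STATUS = 0
BEGIN
    -- business rules are applied to the current row here, then the row is inserted
    INSERT INTO dbo.SomeTargetTable (MissingColumn, TableName, PhysicalColName, Grantor_Grantee)
    VALUES (@MissingColumn, @TableName, @ActualColumnName, @Grantor_Grantee)

    FETCH NEXT FROM SEC_UPDATE
    INTO @MissingColumn, @TableName, @ActualColumnName, @Grantor_Grantee
END

CLOSE SEC_UPDATE
DEALLOCATE SEC_UPDATE
EXEC sp_xml_removedocument @hDocSEC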
A: 

I am not an expert on XML within SQL, but I do have some scars from dealing with it. I strongly recommend looking into SQL Server's XQuery support (new in SQL Server 2005; see Books Online), or perhaps the SQL Server 2000 functionality (start by reading up on sp_xml_preparedocument, again in BOL).

(edited)

I disabled the links, as they stopped working a few minutes after I found them (live and learn). Your new samples show you already know about the sp_xml_... procedures, and you're getting valid XQuery samples, so you should be good.
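
For completeness, the SQL 2000-style route doesn't need a cursor either; OPENXML can feed a single set-based INSERT. A rough, untested sketch using the column names from the question (the target table name is just a placeholder):

DECLARE @hDoc INT
EXEC sp_xml_preparedocument @hDoc OUTPUT, @MissingRecordsXML

INSERT INTO dbo.MissingDSTable (MissingColumn, TableName, PhysicalColName, Grantor_Grantee)
SELECT MissingColumn, TableName, PhysicalColName, Grantor_Grantee
FROM OPENXML(@hDoc, 'MissingDS/MissingTable', 2)
     WITH (MissingColumn   VARCHAR(1000),
           TableName       VARCHAR(100),
           PhysicalColName VARCHAR(100),
           Grantor_Grantee VARCHAR(100))

EXEC sp_xml_removedocument @hDoc   -- release the document handle when done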

Philip Kelley
+2  A: 

Assuming your sample XML looks something like this:

<MissingDS>
  <MissingTable>
    <MissingColumn>abc</MissingColumn>
    <TableName>tblMyTable</TableName>
    <PhysicalColName>table_abc</PhysicalColName>
    <Grantor_Grantee>nobody</Grantor_Grantee>
  </MissingTable>
  <MissingTable>
    <MissingColumn>xyz</MissingColumn>
    <TableName>tblMyTable2</TableName>
    <PhysicalColName>table_xyz</PhysicalColName>
    <Grantor_Grantee>nobody2</Grantor_Grantee>
  </MissingTable>
</MissingDS>

Then you could parse this with the new SQL Server 2005 XQuery support like this:

DECLARE @MissingXML XML
SET @MissingXML = CAST(@MissingRecordsXML AS XML)

SELECT
    Missing.Rec.value('(MissingColumn)[1]', 'varchar(1000)') AS 'MissingColumn',
    Missing.Rec.value('(TableName)[1]', 'varchar(100)') AS 'TableName',
    Missing.Rec.value('(PhysicalColName)[1]', 'varchar(100)') AS 'PhysicalColName',
    Missing.Rec.value('(Grantor_Grantee)[1]', 'varchar(100)') AS 'Grantor_Grantee'
FROM
    @MissingXML.nodes('/MissingDS/MissingTable') AS Missing(Rec)

Of course, if you can SELECT it, you can also INSERT those same rows into a table quite easily:

INSERT INTO 
  dbo.MissingDSTable(MissingColumn, TableName, PhysicalColName, Grantor_Grantee)
    SELECT
       Missing.Rec.value('(MissingColumn)[1]', 'varchar(1000)') AS 'MissingColumn',
       Missing.Rec.value('(TableName)[1]', 'varchar(100)') AS 'TableName',
       Missing.Rec.value('(PhysicalColName)[1]', 'varchar(100)') AS 'PhysicalColName',
       Missing.Rec.value('(Grantor_Grantee)[1]', 'varchar(100)') AS 'Grantor_Grantee'
    FROM
       @MissingXML.nodes('/MissingDS/MissingTable') AS Missing(Rec)

Hope this helps a bit

Marc

marc_s
I find XQuery to be a pain, but once you get it working, it works very well indeed.
Philip Kelley
@Both: ...and no sp_xml_preparedocument needed, either
gbn
Marc - it was of great help. Can I select one row at a time, do some processing, and then insert it into some other table using the approach specified above?
Rohit
@Rohit: sure - you can either just apply your processing directly in the SELECT, or you can SELECT into a temporary table and deal with it there
marc_s
@Marc - in case I am using a temporary table, I will have to loop through each record and perform some logic. Will that be faster than using a cursor? In that case too I am looping through the rows one by one.
Rohit
Well, in SQL, you **shouldn't** be looping through row by row. SQL is **set-based** - think in sets, act in sets, handle sets - don't fall back to row-by-row processing.....
marc_s
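
To illustrate the set-based approach described in the comments above (a rough sketch only: the CASE expression and dbo.TargetTable are made-up placeholders for the real business rules and target table, and @MissingXML is the XML variable from the answer above):

SELECT
    Missing.Rec.value('(MissingColumn)[1]', 'varchar(1000)')  AS MissingColumn,
    Missing.Rec.value('(TableName)[1]', 'varchar(100)')       AS TableName,
    Missing.Rec.value('(PhysicalColName)[1]', 'varchar(100)') AS PhysicalColName,
    Missing.Rec.value('(Grantor_Grantee)[1]', 'varchar(100)') AS Grantor_Grantee
INTO #Missing
FROM @MissingXML.nodes('/MissingDS/MissingTable') AS Missing(Rec)

INSERT INTO dbo.TargetTable (MissingColumn, TableName, PhysicalColName, Grantor_Grantee)
SELECT
    CASE WHEN MissingColumn = '' THEN NULL ELSE MissingColumn END,   -- example "business rule"
    TableName,
    PhysicalColName,
    Grantor_Grantee
FROM #Missing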