I'm trying to send an XML document of approximately 1 MB as an XML parameter to a stored procedure, but the connection always times out.

Does anyone know the size limit for the XML type?


Environment:

  • Microsoft SQL Server 2005 Express
  • .NET Framework 2.0
  • C#


C# Code:

using (SqlCommand commandSave = new SqlCommand("SaveScanning", this.DatabaseConnection))
{
    commandSave.CommandType = System.Data.CommandType.StoredProcedure;

    SqlParameter scanningData = new SqlParameter("ScanningData", System.Data.SqlDbType.Xml);
    scanningData.Value = new SqlXml(new XmlTextReader(**HEREISTHEXMLSTRING**, XmlNodeType.Document, null));
    commandSave.Parameters.Add(scanningData);

    commandSave.ExecuteNonQuery();
}

SQL Code:

CREATE PROCEDURE [dbo].[SaveScanning]
(
    @ScanningData XML
)
AS
BEGIN
    .
    .
    .
A: 

Timeout means the query is taking too long, not that the parameter is too large.

Of course, it's possible that your query takes too long because the parameter is too large. Try the same query, with the same parameter, in SQL Server Management Studio (download the version for Express).
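If the procedure is genuinely slow rather than broken, you can also raise the client-side timeout via SqlCommand.CommandTimeout (the default is 30 seconds). A minimal sketch based on the code in the question:

    using (SqlCommand commandSave = new SqlCommand("SaveScanning", this.DatabaseConnection))
    {
        commandSave.CommandType = System.Data.CommandType.StoredProcedure;

        // Default is 30 seconds; 0 means wait indefinitely.
        commandSave.CommandTimeout = 300;

        // ... add the XML parameter as before, then:
        commandSave.ExecuteNonQuery();
    }

Note this only masks the symptom; it's still worth finding out why the insert is slow.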

John Saunders
I've tried it in SQL Server Management Studio and the procedure completed in just 2 seconds...
Zanoni
+1  A: 

2 GB is the maximum size for the XML data type. I've passed XML parameters containing 20 MB of text that insert 1,000 rows; on a quad-Xeon box with 16 GB of RAM, averaging 500 user SPIDs with connection pooling and sitting around 25% CPU, that takes about 10 seconds.
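For documents that large, it also helps not to materialize the whole XML as an in-memory string first: SqlXml accepts any XmlReader, so you can stream straight from disk. A sketch, assuming connection is an open SqlConnection and the document lives at a hypothetical path scan.xml:

    using System.Data;
    using System.Data.SqlClient;
    using System.Data.SqlTypes;
    using System.Xml;

    // Stream the XML file into the parameter instead of building
    // the entire document as a string in memory.
    using (XmlReader reader = XmlReader.Create(@"C:\scan.xml"))
    using (SqlCommand command = new SqlCommand("SaveScanning", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        SqlParameter scanningData = new SqlParameter("ScanningData", SqlDbType.Xml);
        scanningData.Value = new SqlXml(reader);
        command.Parameters.Add(scanningData);
        command.ExecuteNonQuery();
    }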

Edit:

Which XML method are you using in the SELECT that's having the issue?

XQuery? XPath? Open XML?

If you could provide more of your T-SQL it would help.

DBAndrew
That's the problem... I'm trying to insert 5,000 records from a SELECT over the XML... With more than 1,000 records, SQL Server times out...
Zanoni
5,000 records is what size in XML text? The 1,000 records I mentioned earlier were going into a table with about 50 columns, mostly varchar(50). If your 5,000 records are only 5 columns wide, that's nothing. When I was testing my initial XML import, I recall going up to 20,000 records, which was over 400 MB of XML text, and IIRC it took around 2.5-3 minutes to complete.
DBAndrew
A: 

Finally, I found a solution:

Reading the XML with the OPENXML function is by far the fastest approach.

CREATE PROCEDURE InsertXMLData
(
    @XMLDoc XML
)
AS


Faster way:

DECLARE @handle INT

EXEC sp_xml_preparedocument @handle OUTPUT, @XMLDoc

INSERT INTO TestTable
SELECT * FROM OPENXML (@handle, '/authors', 2) WITH 
  (au_id INT,
   au_lname VARCHAR(20),
   au_fname VARCHAR(20)
  )
EXEC sp_xml_removedocument @handle

Very slow way:

INSERT INTO TestTable
SELECT
    x.item.value('@au_id', 'INT') AS au_id,
    x.item.value('@au_lname', 'VARCHAR(20)') AS au_lname,
    x.item.value('@au_fname', 'VARCHAR(20)') AS au_fname
FROM
    @XMLDoc.nodes('/authors') x(item)
Zanoni