views: 694
answers: 2

I have an app that needs to pass very large strings between a SQL Server database and .NET code. A LINQ query generates the strings when saving them to the database, but when reading the strings back from the database, the app crashes with an OutOfMemoryException because of their size.

Do I have to do something to make the LINQ-generated code avoid that? Some kind of compression might be an option, but I would like to avoid it for performance reasons.

+1  A: 

What do you call "very large"? And what is the string? CLOB? BLOB? xml?

I suspect you should be using things like ExecuteReader(), which (via IDataReader) exposes methods for reading such columns in chunks:

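        // assumes cmd is a SqlCommand selecting the large column(s), and
        // colIndex is the ordinal of that column in the result set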
        using (var reader = cmd.ExecuteReader(
            CommandBehavior.SequentialAccess)) {
            char[] buffer = new char[8040]; // or some multiple (sql server page size)
            while (reader.Read()) {
                long dataOffset = 0, read;
                while((read = reader.GetChars(colIndex, dataOffset, buffer, 0, buffer.Length)) > 0) {
                    // process "read"-many chars from "buffer"
                    dataOffset += read;
                }
            }
        }

Obviously with xml you might want an XmlReader via cmd.ExecuteXmlReader().
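For example (a minimal sketch; it assumes cmd returns xml, e.g. a FOR XML query or a single xml column, and that System.Xml is referenced):

        // stream the xml instead of materializing one huge string
        using (XmlReader xml = cmd.ExecuteXmlReader()) {
            while (xml.Read()) {
                // inspect/process nodes as they arrive
            }
        }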

Updated re LINQ comment (now deleted):

To use IDataReader directly from LINQ-to-SQL, I expect the closest you can get is ctx.GetCommand(), passing it a query. You would then use ExecuteReader or ExecuteXmlReader as above. I don't know much about EF...
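Something along these lines (a rough sketch; BigDocs, Body and id are hypothetical, and connection handling is simplified):

        // get the command LINQ-to-SQL would have executed, then stream it
        var query = ctx.BigDocs.Where(d => d.Id == id).Select(d => d.Body);
        DbCommand cmd = ctx.GetCommand(query);
        ctx.Connection.Open();
        try {
            using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess)) {
                // ... GetChars loop as shown above ...
            }
        } finally {
            ctx.Connection.Close();
        }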

If you give an example of the type of query that is failing, there might be some tricks possible - for example, if you are filtering or selecting subsets of the xml, there are things you can do in SQL/XML - perhaps in a UDF called via LINQ-to-SQL.
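If the UDF route fits, the usual LINQ-to-SQL pattern is to map it on your DataContext so it composes into the generated SQL; a rough sketch (the context class and the dbo.XmlFragment UDF are made up):

        // maps a hypothetical scalar UDF so only the extracted fragment crosses
        // the wire; needs System.Data.Linq(.Mapping) and System.Reflection
        public class DocsDataContext : DataContext {
            public DocsDataContext(string connectionString) : base(connectionString) { }

            [Function(Name = "dbo.XmlFragment", IsComposable = true)]
            public string XmlFragment([Parameter(DbType = "Int")] int docId) {
                return (string)ExecuteMethodCall(this,
                    (MethodInfo)MethodBase.GetCurrentMethod(), docId).ReturnValue;
            }
        }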

Marc Gravell
A: 

I haven't measured the size of the strings, but they quickly reach .NET limits. The information is XML "stored" in an XStreamingElement object generated by a LINQ query. I will give your IDataReader suggestion a shot to see if it solves the problem. Basically, I'm reading back with LINQ to SQL; is there any hook so I can provide that IDataReader chunking code?
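For reference, the save side looks roughly like this (illustrative only; the element names are made up):

        // the xml is produced lazily as it is written out, so the full string
        // never has to exist in memory on the way in
        var doc = new XStreamingElement("Items",
            from i in Enumerable.Range(0, 1000000)
            select new XElement("Item", i));
        doc.Save("items.xml");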

Román
Note I updated re dataOffset... the original would keep reading the first page...
Marc Gravell
Re measuring... SELECT AVG(DATALENGTH(col)) FROM [table] would do it...
Marc Gravell