The title speaks for itself.

I'm dealing with files/data near 2MB in size.

+1  A: 

SqlDataReader.GetBytes should be what you're looking for.
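A minimal sketch of reading the whole value in one call, assuming a hypothetical Documents table with a varbinary(max) Content column (names and connection string are placeholders, not from the question):

    // Read an entire varbinary column into a byte array with GetBytes.
    // "Documents", "Content", and the connection string are illustrative only.
    using System;
    using System.Data.SqlClient;

    class BlobExample
    {
        static byte[] ReadContent(string connectionString, int id)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT Content FROM Documents WHERE Id = @Id", connection))
            {
                command.Parameters.AddWithValue("@Id", id);
                connection.Open();

                using (var reader = command.ExecuteReader())
                {
                    if (!reader.Read())
                        return null;

                    // Passing a null buffer makes GetBytes return the field's total length.
                    long length = reader.GetBytes(0, 0, null, 0, 0);
                    var data = new byte[length];
                    reader.GetBytes(0, 0, data, 0, (int)length);
                    return data;
                }
            }
        }
    }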

Matt Brunell
+1  A: 

If you need the bytes in an array, then all the methods are pretty much going to be the same, since you have to stream all the bytes from the result stream into the array (the SqlClient classes are going to do this for you).

It doesn't matter whether you read a varbinary output parameter from a stored procedure or a field from a table, although the stored procedure will be slightly faster since there is no result-set schema to return. That difference is negligible compared to the size of the data you are shuttling back, however.

If you really want a performance improvement, you are better off opening a DataReader with CommandBehavior.SequentialAccess and then calling GetBytes on the reader for the returned field. That way, you pull only the bytes you need and can process the data in chunks.

This is important, because you are likely to run into issues trying to allocate a 2MB byte array. Generally speaking, allocating large contiguous blocks of memory (which is what arrays are) is a bad idea and will hurt performance. Where possible, process the data in smaller chunks.
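A rough sketch of that chunked approach; the Documents/Content names, the destination stream, and the 8KB buffer size are assumptions for illustration, not something from the question:

    // Chunked reading with CommandBehavior.SequentialAccess: the reader does not
    // buffer the whole row, so GetBytes pulls bytes only as you request them.
    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class ChunkedBlobExample
    {
        static void CopyContentTo(string connectionString, int id, Stream destination)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT Content FROM Documents WHERE Id = @Id", connection))
            {
                command.Parameters.AddWithValue("@Id", id);
                connection.Open();

                using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
                {
                    if (!reader.Read())
                        return;

                    var buffer = new byte[8192]; // tuning knob; see the comments below
                    long offset = 0;
                    long bytesRead;

                    // Keep reading until GetBytes reports that no bytes remain.
                    while ((bytesRead = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                    {
                        destination.Write(buffer, 0, (int)bytesRead);
                        offset += bytesRead;
                    }
                }
            }
        }
    }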

casperOne
Using a buffer with GetBytes, what size of buffer is reasonable considering my 2MB total size?
Larsenal
@Larsenal: That's impossible to say; you can only make that determination based on the performance characteristics of your app. It's a tuning issue. I'd probably start somewhere around a few KB, assuming that's big enough for the data you wish to process.
casperOne