I'd like to read binary data from a blob, using the Stream interface around it, but without the blob having to be loaded entirely on the client side, either in memory or in a file.
I want the code that uses the blob to be able to seek and read, with only as much data as is needed to satisfy those seeks and reads being brought over the wire.
For example, pretend the blob is a 250MB Photoshop image. The thumbnailer code knows how to read the first 8 bytes of the file, recognize it as a PSD, seek to the offset that contains the 3k thumbnail, and read just that.
So rather than allocating 250MB of memory, or creating a temporary file, or waiting for 250MB to come over the wire, the hypothetical SQLServerBlobStreamServerCursor class would limit data traffic to what is actually asked for.
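To make the idea concrete, here's a rough sketch of the kind of Stream I have in mind. This is not a real implementation, just the shape of it: I'm assuming a Blobs table with an int Id key and a varbinary(max) Content column (made-up names), and using DATALENGTH and SUBSTRING on the server so each Read() only pulls the requested byte range over the wire.

```csharp
using System;
using System.Data.SqlClient;
using System.IO;

// Sketch only: a read-only, seekable Stream over a varbinary(max) column.
// Table, column, and key names (Blobs, Content, Id) are made up.
public class SqlBlobReadStream : Stream
{
    private readonly string connectionString;
    private readonly int blobId;
    private readonly long length;
    private long position;

    public SqlBlobReadStream(string connectionString, int blobId)
    {
        this.connectionString = connectionString;
        this.blobId = blobId;

        // One small query up front to learn the total length.
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT DATALENGTH(Content) FROM Blobs WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", blobId);
            conn.Open();
            length = Convert.ToInt64(cmd.ExecuteScalar());
        }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        if (position >= length) return 0;

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            // SUBSTRING is 1-based; only the requested range crosses the wire.
            "SELECT SUBSTRING(Content, @start, @len) FROM Blobs WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", blobId);
            cmd.Parameters.AddWithValue("@start", position + 1);
            cmd.Parameters.AddWithValue("@len", count);
            conn.Open();
            var chunk = (byte[])cmd.ExecuteScalar();
            Array.Copy(chunk, 0, buffer, offset, chunk.Length);
            position += chunk.Length;
            return chunk.Length;
        }
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        // Seeking is just arithmetic on the client; no data moves.
        switch (origin)
        {
            case SeekOrigin.Begin:   position = offset; break;
            case SeekOrigin.Current: position += offset; break;
            case SeekOrigin.End:     position = length + offset; break;
        }
        return position;
    }

    public override bool CanRead  { get { return true; } }
    public override bool CanSeek  { get { return true; } }
    public override bool CanWrite { get { return false; } }
    public override long Length   { get { return length; } }
    public override long Position { get { return position; } set { position = value; } }
    public override void Flush() { }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
}
```

Obviously that pays a round trip per Read() call and would want some read-ahead buffering in practice; the question is whether ADO.NET or SQL Server already gives me something like this so I don't have to roll it myself.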
Research
There's HOW TO: Read and Write a File to and from a BLOB Column by Using Chunking in ADO.NET and Visual Basic .NET, which talks about being able to read and write in chunks. But the code in the article is hard to follow, with the lines cut off like that; I can't stand it, so I'll come back to it later.
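As far as I can make out, the reading side of that article boils down to CommandBehavior.SequentialAccess plus GetBytes, which pulls the column in chunks instead of materializing the whole value. Something like this (table and column names are placeholders again):

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

class ChunkedBlobReader
{
    // Copies the whole Content column for one row to an output Stream,
    // 8K at a time, without ever holding the full blob in memory.
    public static void CopyBlobToStream(string connectionString, int blobId, Stream output)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT Content FROM Blobs WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", blobId);
            conn.Open();

            // SequentialAccess streams the column rather than buffering it.
            using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                if (!reader.Read()) return;

                var buffer = new byte[8192];
                long dataOffset = 0;
                long bytesRead;
                while ((bytesRead = reader.GetBytes(0, dataOffset, buffer, 0, buffer.Length)) > 0)
                {
                    output.Write(buffer, 0, (int)bytesRead);
                    dataOffset += bytesRead;
                }
            }
        }
    }
}
```

That keeps memory use down, but as far as I can tell it still reads the column forward from the start, so by itself it doesn't give me the seek behavior I'm after.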
Also, this guy mentioned a new SQL Server 2005 T-SQL syntax, column.WRITE(), that can be used to write data in small chunks (to avoid consuming all your server's memory). Maybe there's a corresponding [column].Read() pseudo-method.
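If I understand the .WRITE syntax right, a chunked upload would look roughly like this (made-up table and column names again; the column has to hold a non-NULL value before .WRITE can append to it). I haven't found a read-side counterpart beyond SUBSTRING, which is what the sketch further up relies on.

```csharp
using System;
using System.Data.SqlClient;
using System.IO;

class ChunkedBlobWriter
{
    // Uploads an input Stream into a varbinary(max) column 8K at a time,
    // appending each chunk with the SQL Server 2005 .WRITE clause.
    public static void CopyStreamToBlob(string connectionString, int blobId, Stream input)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Reset to an empty (non-NULL) value so the appends below are valid.
            using (var init = new SqlCommand(
                "UPDATE Blobs SET Content = 0x WHERE Id = @id", conn))
            {
                init.Parameters.AddWithValue("@id", blobId);
                init.ExecuteNonQuery();
            }

            var buffer = new byte[8192];
            int bytesRead;
            while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                using (var cmd = new SqlCommand(
                    // .WRITE(@chunk, NULL, 0): a NULL offset appends to the end of the value.
                    "UPDATE Blobs SET Content.WRITE(@chunk, NULL, 0) WHERE Id = @id", conn))
                {
                    var chunk = new byte[bytesRead];
                    Array.Copy(buffer, chunk, bytesRead);
                    cmd.Parameters.AddWithValue("@chunk", chunk);
                    cmd.Parameters.AddWithValue("@id", blobId);
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}
```

So writing in small pieces seems covered; it's the seekable, on-demand read side that I still don't see a built-in answer for.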
Microsoft has an article: Conserving Resources When Writing BLOB Values to SQL Server