I am writing a C# class library to transfer large volumes of data through COM automation using an `IStream`. It uses the `CreateStreamOnHGlobal` API call to create the stream, and the methods of `System.Runtime.InteropServices.ComTypes.IStream` to work with it.

My question is: when transferring large volumes of data, what is the best way to keep the memory footprint under control? Loading 100MB+ of file data into memory at once seems wasteful, and the client application would have to wait for that load to finish before it could read anything.

My plan was to create a reasonably sized stream and reuse it: write one chunk, let the client consume it, then `Seek` back to the beginning and overwrite it with the next chunk. Am I going about this the right way, and is there a better method to solve this problem?
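For reference, here is a minimal sketch of the chunked, rewind-and-overwrite approach described above. It P/Invokes `CreateStreamOnHGlobal` and reuses one small `IStream` for the whole file; `sendChunkToClient` is a hypothetical placeholder for whatever COM call hands each chunk to the consumer, and the 256 KB chunk size is an arbitrary assumption. This is Windows-only code and assumes the consumer finishes with each chunk before the next overwrite.

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;
using System.Runtime.InteropServices.ComTypes;

static class ChunkedStreamTransfer
{
    // CreateStreamOnHGlobal lives in ole32.dll; passing IntPtr.Zero for
    // hGlobal makes it allocate its own growable HGLOBAL for the stream.
    [DllImport("ole32.dll")]
    private static extern int CreateStreamOnHGlobal(
        IntPtr hGlobal, bool fDeleteOnRelease, out IStream ppstm);

    private const int STREAM_SEEK_SET = 0;

    public static void TransferFile(string path, Action<IStream, int> sendChunkToClient)
    {
        const int chunkSize = 256 * 1024;  // assumed working-buffer size
        Marshal.ThrowExceptionForHR(
            CreateStreamOnHGlobal(IntPtr.Zero, true, out IStream stream));
        try
        {
            var buffer = new byte[chunkSize];
            using (var file = File.OpenRead(path))
            {
                int read;
                while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Rewind and overwrite the same region instead of
                    // letting the HGLOBAL grow to the size of the file.
                    stream.Seek(0, STREAM_SEEK_SET, IntPtr.Zero);
                    stream.Write(buffer, read, IntPtr.Zero);

                    // Hypothetical hand-off: the client must finish
                    // reading before the next iteration overwrites it.
                    sendChunkToClient(stream, read);
                }
            }
        }
        finally
        {
            Marshal.ReleaseComObject(stream);
        }
    }
}
```

The key trade-off in this design is that the producer and consumer are serialized: each write must wait for the client to drain the previous chunk, so some form of synchronization (or a second, double-buffered stream) would be needed in practice.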