I've got a TCP server written in C# that processes POST data sent to it. It works fine until a large amount of data (more than about 1 GB) is sent, at which point it runs out of memory, because I hold the entire payload in memory as a byte array (built up via an intermediary List DTO). For large files I now stream the data down to disk and pass the filename around instead, with the intention of streaming it back from disk later.
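Roughly, the current read path looks like this (simplified, and the names are made up, but it shows the in-memory buffering I described):

```csharp
using System;
using System.Collections.Generic;
using System.Net.Sockets;

class PostReader
{
    public byte[] ReadPostBody(NetworkStream network)
    {
        var body = new List<byte>();    // the intermediary List DTO
        var buffer = new byte[8192];
        int read;
        while ((read = network.Read(buffer, 0, buffer.Length)) > 0)
        {
            for (int i = 0; i < read; i++)
                body.Add(buffer[i]);    // whole payload ends up on the heap
        }
        return body.ToArray();          // plus a second full copy here
    }
}
```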
Currently all of my routines are written to expect byte arrays, which, in hindsight, was a little short-sighted. If I simply wrap the byte array in a MemoryStream, will that double the memory usage? I'm thinking that rewriting my code to work against a stream would let me reuse it when reading from disk, something like the sketch below.
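This is what I'm picturing: one routine that takes a Stream, fed either by a MemoryStream over the existing byte array or by a FileStream for the large uploads that were spilled to disk (again, the class and method names here are just illustrative):

```csharp
using System;
using System.IO;

class PostProcessor
{
    // One routine that works on any Stream instead of byte[].
    public void Process(Stream body)
    {
        var buffer = new byte[8192];
        int read;
        while ((read = body.Read(buffer, 0, buffer.Length)) > 0)
        {
            // ... handle this chunk ...
        }
    }
}

class Caller
{
    public void HandleSmall(byte[] data, PostProcessor processor)
    {
        // My understanding is that this constructor wraps the existing array
        // rather than copying it -- but that's part of what I'm asking.
        using (var ms = new MemoryStream(data))
            processor.Process(ms);
    }

    public void HandleLarge(string tempFilePath, PostProcessor processor)
    {
        // Large uploads that were written to disk get streamed back from the file.
        using (var fs = File.OpenRead(tempFilePath))
            processor.Process(fs);
    }
}
```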
Sorry for the stupid questions; I'm never sure when C# takes a copy of the data and when it just takes a reference.