My ASP.NET application allows users to upload and download large files. Both procedures involve reading and writing file streams. What should I do to ensure the application doesn't hang or crash when it handles a large file? Should the file operations be handled on a worker thread, for example?
Make sure you properly buffer the files so that they don't take inordinate amounts of memory in the system.
For example, here is an excerpt from a download application, taken from inside the while loop that reads the file:
// Read the data in buffer.
length = iStream.Read(buffer, 0, bufferSize);
// Write the data to the current output stream.
Response.OutputStream.Write(buffer, 0, length);
Here bufferSize is something reasonable, e.g. 100,000 bytes; the trade-off is that smaller buffer sizes make the transfer slower.
http://support.microsoft.com/kb/812406
Edit: Also be sure that IIS is configured to accept a large enough request length (this is a separate setting again under IIS7) and a long enough timeout.
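For reference, those limits live in web.config; a rough sketch is below. The values are placeholders, and the system.webServer section only applies when running under IIS7's integrated pipeline:
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB; executionTimeout is in seconds -->
    <httpRuntime maxRequestLength="512000" executionTimeout="3600" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- IIS7 enforces its own upload limit, in bytes -->
        <requestLimits maxAllowedContentLength="524288000" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>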
Unless this is the primary purpose of your site, consider partitioning these operations into a separate application, e.g. a sub-application or sub-domain. Besides reducing risk, this would also simplify scaling out as your user base grows.
Thanks, Turnkey. Is there any practical difference between doing this (which is what I was doing):
long fileSize;
byte[] fileContent;
using (FileStream sourceFile = new FileStream(filePath, FileMode.Open))
{
    fileSize = sourceFile.Length;
    fileContent = new byte[(int)fileSize];
    sourceFile.Read(fileContent, 0, (int)sourceFile.Length);
    sourceFile.Close();
}
HttpContext.Current.Response.BinaryWrite(fileContent);
and this (which is what MS suggest in the knowledgebase article):
fileStream = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read);
fileStreamLength = fileStream.Length;
while (fileStreamLength > 0)
{
    if (HttpContext.Current.Response.IsClientConnected)
    {
        length = fileStream.Read(buffer, 0, 10000);
        HttpContext.Current.Response.OutputStream.Write(buffer, 0, length);
        HttpContext.Current.Response.Flush();
        buffer = new Byte[10000];
        fileStreamLength = fileStreamLength - length;
    }
    else
    {
        fileStreamLength = -1;
    }
}
(code edited for brevity).
For example, does the first method actually use more server memory?