I have a site where a user can download a file. Some files are extremely large (the largest being 323 MB). When I test downloading this file I get an out-of-memory exception. The only way I know to download the file is shown below. I'm using this code because the URL is encoded and I can't let the user link directly to the file. Is there another way to download this file without having to read the whole thing into a byte array?

  FileStream fs = new FileStream(context.Server.MapPath(url), FileMode.Open,
                                                           FileAccess.Read);
  BinaryReader br = new BinaryReader(fs);
  long numBytes = new FileInfo(context.Server.MapPath(url)).Length;
  // Reads the entire file into memory at once; this is what throws the
  // OutOfMemoryException for large files.
  byte[] bytes = br.ReadBytes((int) numBytes);

  string filename = Path.GetFileName(url);
  context.Response.Buffer = true;
  context.Response.Charset = "";

  context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
  context.Response.ContentType = "application/x-rar-compressed";
  context.Response.AddHeader("content-disposition", "attachment;filename=" + filename);

  context.Response.BinaryWrite(bytes);
  context.Response.Flush();
  context.Response.End();
A: 

I'll ask the obvious question: can you read a fixed-size chunk from the BinaryReader in a loop, writing each chunk out to the response as you go? I don't see any reason why you need to read the entire file into memory in a single operation.

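For example, a minimal sketch of that loop, assuming the same context and url variables as in the question (it writes through Response.OutputStream rather than BinaryWrite, so only the bytes actually read in each pass are sent):

  string path = context.Server.MapPath(url);
  context.Response.Buffer = false;
  context.Response.ContentType = "application/x-rar-compressed";
  context.Response.AddHeader("content-disposition",
      "attachment;filename=" + Path.GetFileName(path));

  using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
  {
      byte[] buffer = new byte[65536];   // 64 KB chunks instead of one big array
      int bytesRead;
      while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0
             && context.Response.IsClientConnected)
      {
          context.Response.OutputStream.Write(buffer, 0, bytesRead);
          context.Response.Flush();
      }
  }
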
Joe
+3  A: 

Try something like this:

  using (var br = new BinaryReader(fs))
  using (var toFile = File.OpenWrite(ToFileName))
  {
      byte[] buff = new byte[2000];
      int bytesRead;
      // Write only the bytes actually read; the last chunk is usually
      // shorter than the buffer.
      while ((bytesRead = br.Read(buff, 0, buff.Length)) > 0)
      {
          toFile.Write(buff, 0, bytesRead);
          toFile.Flush();
      }
  }

The important thing is to use a small buffer and flush the output stream as you go, so the data is released from memory.

Right now you are holding the entire file in memory in a single byte array before writing it out. Chunking the transfer into a smaller buffer relieves that burden on memory. (For the question's scenario, toFile would be context.Response.OutputStream rather than a file.)

overstood
+11  A: 

Instead of

context.Response.BinaryWrite(bytes);

use

context.Response.TransmitFile(context.Server.MapPath(url));

This will avoid reading the entire file into memory.
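
A minimal sketch in context, assuming the same handler, context, and url variables as in the question's code:

  string path = context.Server.MapPath(url);

  context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
  context.Response.ContentType = "application/x-rar-compressed";
  context.Response.AddHeader("content-disposition",
      "attachment;filename=" + Path.GetFileName(path));

  // TransmitFile streams the file to the client without buffering it
  // in server memory.
  context.Response.TransmitFile(path);
  context.Response.End();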

Peter Mortensen
Thanks Peter! That worked great.
geoff
A: 

Here is some code that I pulled from TechNet. Just save it as chunkedfilefetch.aspx. This works for files up to 2 GB.

<%@ Page Language="VB" %>
<%@ Import Namespace="System.Data" %>
<%@ Import Namespace="System.IO" %>
<script language="VB" runat="server">
    '------------------------------------------------------------------------+
    Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs)
        Const ChunkSize As Integer = 10000
        Dim dlDir As String = "downloadfiles/"
        Dim strFileName As String = Request.QueryString("FileName")

        If (strFileName <> "") Then
            If (Regex.IsMatch(strFileName, "[/]") Or Regex.IsMatch(strFileName, "[\\]") Or _
                InStr(strFileName, "..") > 0 Or _
                Not Regex.IsMatch(strFileName, "^[^\\\./:\*\?\""<>\|]{1}[^\\/:\*\?\""<>\|]{0,254}$")) Then

                ' NOTE: The only way we should get here is if the user is fiddling with the query string
                ' TODO: Add instrumentation to detect hacking attempts.

            Else
                Dim buffer(ChunkSize - 1) As Byte   ' Buffer to read input file (VB array bounds are inclusive).
                Dim length As Integer               ' Number of Bytes returned by read.
                Dim dataToRead As Long              ' Total bytes to read.
                Dim path As String = Server.MapPath(dlDir + Request.QueryString("FileName"))
                Dim file As System.IO.FileInfo = New System.IO.FileInfo(path)
                Dim filename As String = System.IO.Path.GetFileName(path)

                ' Open the file.
                Using iStream = New System.IO.FileStream(path, System.IO.FileMode.Open, _
                                                       IO.FileAccess.Read, IO.FileShare.Read)
                    dataToRead = iStream.Length
                    Response.ContentType = "application/octet-stream"
                    Response.AddHeader("Content-Disposition", "attachment; filename=" & filename)

                    ' Read and send the file one chunk at a time.
                    While dataToRead > 0
                        If Response.IsClientConnected Then
                            length = iStream.Read(buffer, 0, ChunkSize)
                            Response.OutputStream.Write(buffer, 0, length)
                            Response.Flush()
                            dataToRead = dataToRead - length
                        Else
                            'prevent infinite loop if user disconnects
                            dataToRead = -1
                        End If
                    End While
                End Using
                ' Stop here so the grid HTML below is not appended to the download.
                Response.End()
            End If

        End If
        BindFileDataToGrid("Name")
    End Sub

    '------------------------------------------------------------------------+
    Sub SortFileList(sender as Object, e as DataGridSortCommandEventArgs)
        BindFileDataToGrid(e.SortExpression)
    End Sub

    '------------------------------------------------------------------------+
    Sub BindFileDataToGrid(strSortField As String)
        Dim strPath As String = "downloadfiles/"

        Dim myDirInfo    As DirectoryInfo
        Dim arrFileInfo  As Array
        Dim myFileInfo   As FileInfo

        Dim filesTable   As New DataTable
        Dim myDataRow    As DataRow
        Dim myDataView   As DataView

        ' Add the columns to the grid
        filesTable.Columns.Add("Name", Type.GetType("System.String"))
        filesTable.Columns.Add("Length", Type.GetType("System.Int32"))
        filesTable.Columns.Add("LastWriteTime", Type.GetType("System.DateTime"))
        filesTable.Columns.Add("Extension", Type.GetType("System.String"))

        ' Get Directory & File Info
        myDirInfo = New DirectoryInfo(Server.MapPath(strPath))
        arrFileInfo = myDirInfo.GetFiles()

        ' Iterate the FileInfo objects and extract the data
        For Each myFileInfo In arrFileInfo
            myDataRow = filesTable.NewRow()
            myDataRow("Name")          = myFileInfo.Name
            myDataRow("Length")        = myFileInfo.Length
            myDataRow("LastWriteTime") = myFileInfo.LastWriteTime
            myDataRow("Extension")     = myFileInfo.Extension

            filesTable.Rows.Add(myDataRow)
        Next myFileInfo

        ' Create a new DataView.
        myDataView = filesTable.DefaultView
        myDataView.Sort = strSortField

        ' Set DataGrid's data source and data bind.
        dgFileList.DataSource = myDataView
        dgFileList.DataBind()
    End Sub

</script>

<html>
<head>
   <title>Build Dynamic File Links</title>
</head>
<body>

<form runat="server">
<asp:DataGrid id="dgFileList" runat="server"
    BorderColor           = "blue"
    CellSpacing           = 0
    CellPadding           = 4
    HeaderStyle-BackColor = "#0051E5"
    HeaderStyle-ForeColor = "#FFFFFF"
    HeaderStyle-Font-Bold = "True"
    ItemStyle-BackColor   = "silver"
    AutoGenerateColumns   = "False"
    AllowSorting          = "True"
    OnSortCommand         = "SortFileList">

    <Columns>
        <asp:HyperLinkColumn DataNavigateUrlField="Name" DataNavigateUrlFormatString="ChunkedFileFetch.aspx?FileName={0}" DataTextField="Name" HeaderText="Name:" SortExpression="Name" />
        <asp:BoundColumn DataField="Length" HeaderText="File Size in bytes:" ItemStyle-HorizontalAlign="Right" SortExpression="Length" />
        <asp:BoundColumn DataField="LastWriteTime" HeaderText="Date Created:" SortExpression="LastWriteTime" />
        <asp:BoundColumn DataField="Extension" HeaderText="Type:" SortExpression="Extension" />
    </Columns>
</asp:DataGrid>
</form>
<hr />
</body>
</html>

I need some help getting this code to resume. The TechNet article mentions that these transfers can be made resumable, but I believe that needs IHttpHandlers or zip handlers, which I just don't understand; I'm not a very strong coder. What I did was take this original code snippet, call it fetch.aspx, and pass an OrderNum through the URL, e.g. fetch.aspx?ordernum=xxxxxxx. It then reads the filename/location from the database and chunks the file out from a secured location outside the webroot.

I need a way to make this resumable, but every resumable-download article I've read assumes the .ZIP is within the webroot (e.g. http://www.devx.com/dotnet/Article/22533/1954), whereas I need to stream from a secured location.

I'm not a .NET coder at all; at best I can do a bit of ColdFusion. If anyone could help me modify a handler to do this, I would really appreciate it.

Requirements:

  • I have a working fetch.aspx script that functions well and uses the above code snippet as a base.
  • Download files are large (around 600 MB) and are stored in a secured location outside of the webroot.
  • Users click fetch.aspx to start the download, and would therefore click it again if the download failed.
  • OrderNum, which is passed in the URL string, is unique and could be used as the ETag.
  • Files should be resumable (a rough sketch of the Range handling this implies follows below).
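
For reference, a rough, untested sketch of the Range/ETag handling such a handler would need, written in C#. GetFilePathForOrder is a hypothetical database lookup, and the three-argument TransmitFile overload requires .NET 2.0 SP1 or later; a production version would also need to honor If-Range:

  using System;
  using System.IO;
  using System.Web;

  public class ResumableFetch : IHttpHandler
  {
      public bool IsReusable { get { return false; } }

      public void ProcessRequest(HttpContext context)
      {
          string orderNum = context.Request.QueryString["ordernum"];
          string path = GetFilePathForOrder(orderNum);   // hypothetical DB lookup
          long length = new FileInfo(path).Length;
          long start = 0;

          // If the client sends "Range: bytes=N-", resume from byte N.
          string range = context.Request.Headers["Range"];
          if (!string.IsNullOrEmpty(range) && range.StartsWith("bytes="))
          {
              long.TryParse(range.Substring(6).Split('-')[0], out start);
              context.Response.StatusCode = 206;   // Partial Content
              context.Response.AddHeader("Content-Range",
                  "bytes " + start + "-" + (length - 1) + "/" + length);
          }

          context.Response.AddHeader("Accept-Ranges", "bytes");
          context.Response.AddHeader("ETag", "\"" + orderNum + "\"");
          context.Response.ContentType = "application/octet-stream";
          context.Response.AddHeader("Content-Disposition",
              "attachment; filename=" + Path.GetFileName(path));

          // Send only the requested slice, streamed without buffering.
          context.Response.TransmitFile(path, start, length - start);
          context.Response.End();
      }

      private string GetFilePathForOrder(string orderNum)
      {
          // Placeholder: look up the file path for this order in the database.
          throw new NotImplementedException();
      }
  }
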
Kelvin H