We have a client application that allows users to download full-length 192 kb/s MP3 audio files. Because the files are stored externally to our business, we need to be able to:

1) Copy the file from the external location into a local server cache

2) Copy that file to the client that requested it

Obviously, further requests for the same file would be served from the cache and would not need to go external.

Now, we already have a system that does this (using a Squid cache), but the problem is that step 2 only executes once step 1 is fully complete. This means that if a 10-minute 192 kb/s track takes 75 seconds to copy from the external location into the cache, the client's HTTP timeout kicks in at about 60 seconds! This does not fulfil our requirements.

It seems that what we need is a cache that can transfer data out to a client WHILE it is still receiving data from the external location. My questions are:

1) Can this be done with a Squid cache? (It is the legacy incumbent and not my choice; the squid.conf sketch below shows the directives that look relevant.)

2) If not, what technology would be the most suited for this kind of scenario (cost is not really an issue)?
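
For reference, a sketch of those squid.conf directives (illustrative only, not a tested configuration):

# Let concurrent requests for the same URL share a single in-flight
# fetch rather than each triggering its own external copy.
collapsed_forwarding on

# How far ahead of the slowest client Squid will read from the origin.
read_ahead_gap 16 KB

# Keep fetching the whole object even if a client aborts, so the cache
# ends up with a complete copy for subsequent requests.
quick_abort_min -1 KB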

Please let me know if this isn't clear in any way!

+1  A: 

Here's an ASP.NET handler I wrote a while back to proxy content from another server. It wouldn't be that hard to extend it to write to a file and serve from that file the second time round. Flushing the response inside the loop would make it deliver while downloading:

using System.Diagnostics;
using System.IO;
using System.Net;
using System.Text.RegularExpressions;
using System.Web;
using System.Web.Services;

namespace bla.com
{
    /// <summary>
    /// Proxies a whitelisted remote URL straight through to the client.
    /// </summary>
    [WebService(Namespace = "http://tempuri.org/")]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    public class Proxy : IHttpHandler
    {
        // Whitelist pattern to stop the handler being used as an open proxy.
        private static Regex urlRegex = new Regex(@"http://some_regex_here_to_prevent_abuse_of_proxy.mp3", RegexOptions.Compiled);

        public void ProcessRequest(HttpContext context)
        {
            var targetUrl = context.Request.QueryString["url"];
            MatchCollection matches = urlRegex.Matches(targetUrl);

            // Reject anything that isn't an exact match for the whitelist.
            if (matches.Count != 1 || matches[0].Value != targetUrl)
            {
                context.Response.StatusCode = 403;
                context.Response.ContentType = "text/plain";
                context.Response.Write("Forbidden");
                return;
            }

            HttpWebRequest req = (HttpWebRequest)WebRequest.Create(targetUrl);
            using (HttpWebResponse response = (HttpWebResponse)req.GetResponse())
            using (Stream responseStream = response.GetResponseStream())
            {
                context.Response.ContentType = response.ContentType;

                // Copy the upstream response through in 4 KB chunks.
                // Adding context.Response.Flush() inside this loop is what
                // would push each chunk to the client while the download runs.
                byte[] buffer = new byte[4096];
                int amt;
                while ((amt = responseStream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    context.Response.OutputStream.Write(buffer, 0, amt);
                    Debug.WriteLine(amt);
                }
            }
            context.Response.Flush();
        }

        public bool IsReusable
        {
            get { return false; }
        }
    }
}
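
A minimal sketch of the write-to-file variant mentioned above (the cache root, the MD5-based file naming, and the Mp3Cache class name are illustrative assumptions, not part of the handler):

using System;
using System.IO;
using System.Net;
using System.Security.Cryptography;
using System.Text;
using System.Web;

namespace bla.com
{
    public static class Mp3Cache
    {
        // Hypothetical cache location; pick a path the worker process can write to.
        private static readonly string CacheRoot = @"C:\mp3cache";

        public static void StreamAndCache(HttpContext context, string targetUrl)
        {
            // Derive a stable file name from the URL.
            string name = BitConverter.ToString(
                MD5.Create().ComputeHash(Encoding.UTF8.GetBytes(targetUrl)))
                .Replace("-", "") + ".mp3";
            string cacheFile = Path.Combine(CacheRoot, name);

            if (File.Exists(cacheFile))
            {
                // Second time round: serve straight from the local cache.
                context.Response.ContentType = "audio/mpeg";
                context.Response.TransmitFile(cacheFile);
                return;
            }

            HttpWebRequest req = (HttpWebRequest)WebRequest.Create(targetUrl);
            using (HttpWebResponse response = (HttpWebResponse)req.GetResponse())
            using (Stream upstream = response.GetResponseStream())
            using (FileStream cache = File.Create(cacheFile))
            {
                context.Response.ContentType = response.ContentType;
                byte[] buffer = new byte[4096];
                int amt;
                while ((amt = upstream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    cache.Write(buffer, 0, amt);                         // tee to disk
                    context.Response.OutputStream.Write(buffer, 0, amt); // and to the client
                    context.Response.Flush();                            // deliver while downloading
                }
            }
        }
    }
}

A production version would want to download to a temporary name and rename on completion, so an aborted transfer can't leave a truncated file that later requests treat as a cache hit. The handler itself gets mapped to a path via the httpHandlers section of web.config.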
spender
You could also increase the length of the timeout to allow for larger files.
Rick Ratayczak
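
For example, a sketch of that suggestion (the ten-minute values are illustrative):

using System.Net;
using System.Web;

namespace bla.com
{
    public static class TimeoutTuning
    {
        // Build the upstream request with timeouts long enough for a full
        // track; tune the values to your longest expected transfer.
        public static HttpWebRequest CreateLongRequest(HttpContext context, string targetUrl)
        {
            HttpWebRequest req = (HttpWebRequest)WebRequest.Create(targetUrl);
            req.Timeout = 10 * 60 * 1000;          // ms allowed for the response to start
            req.ReadWriteTimeout = 10 * 60 * 1000; // ms allowed per stream read

            // ASP.NET's own execution limit, in seconds (enforced only when
            // compilation debug="false" in web.config).
            context.Server.ScriptTimeout = 10 * 60;
            return req;
        }
    }
}

Note this only stretches the server-side limits; the client's own HTTP timeout still applies, which is why delivering while downloading matters.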