Q:
Our site provides an upload form that our members use to upload photos, which we then store and allow them to share. The upload is a simple form POST, and we process the files with Perl's CGI.pm. Here is our Apache setup:

Apache/2.0.63 (Unix) mod_ssl/2.0.63 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635
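
In case it's useful, our handler looks roughly like this (a simplified sketch rather than our exact code - the "photo" field name and the output path are illustrative):

#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q = CGI->new;

# "photo" is the name of the file input field in our upload form.
my $fh = $q->upload('photo');

if (defined $fh) {
    # Stream the uploaded file to disk.
    open my $out, '>', "/var/uploads/photo.$$" or die "open: $!";
    binmode $out;
    print {$out} $_ while <$fh>;
    close $out;
    print $q->header('text/plain'), "Upload OK\n";
}
else {
    # cgi_error() reports problems such as a request body over $CGI::POST_MAX.
    print $q->header(-status => $q->cgi_error || '400 Bad Request'),
          "Upload failed\n";
}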

For some reason a small percentage of our users are running into an issue where the upload times out and fails. Here is what our Apache error log reports on failure:

(104)Connection reset by peer: Error reading request entity data, referer: http://domain.com/upload/photo

At first we thought $CGI::POST_MAX might be set too low, so that large photos were being rejected. But even after increasing it to 100 MB the errors still occur. We can't replicate the issue, and there doesn't seem to be any rhyme or reason to which users run into problems: it happens across different browsers, operating systems, etc. We're also not sure whether it's a Perl issue or an Apache configuration issue.
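
For reference, this is the setting we raised (the value is in bytes; 100 MB here):

$CGI::POST_MAX = 100 * 1024 * 1024;  # reject request bodies larger than ~100 MB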

I'd appreciate any advice on what might be causing this to happen and suggestions on how to resolve the problem. Thanks in advance for your help!

A: 

You can try increasing the Timeout setting in Apache's config, in case the failures are just network slowness spikes.
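
For example, in httpd.conf (the 400 is illustrative - Apache 2.0's default is 300 seconds):

# Give slow clients more time to finish sending the request body
Timeout 400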

Also see this post: http://www.mail-archive.com/[email protected]/msg00457.html - according to it there may be a 64 MB limit, though I don't see any corroborating evidence.

Also, see this example of how to set a read limit: http://permalink.gmane.org/gmane.comp.apache.mod-perl/24260
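
If you also want a hard size cap enforced by Apache itself, one option (my assumption - the linked post may describe a mod_perl-specific mechanism instead) is the core LimitRequestBody directive, scoped to the upload URL:

# Cap request bodies at ~100 MB (104857600 bytes) for uploads only
<Location /upload>
    LimitRequestBody 104857600
</Location>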

DVK
KeepAlive is enabled, so I didn't think that would be an issue. MaxKeepAliveRequests is currently set to 100. Is that the value you're suggesting we increase? If not, what specifically in Apache do you think we should increase, and more importantly - why?
Russell C.
I just noticed the Timeout variable, which is set to 200. Maybe that is what you were referring to. Is that low or high? What would you recommend changing it to?
Russell C.
@Russell - the latter: Timeout. Since these are internet users and the files are large, 200 might be the culprit, though I'm not 100% sure - try changing it to, say, 400 and see whether the average number of errors like this drops over a span of several days.
DVK
@DVK - that's not the best solution, since it means increased load on our servers for all the other requests that aren't upload related. Is there some kind of JS trick we could use instead to keep the session alive just for those users uploading files?
Russell C.
@Russell - There might be (there are multi-part upload things being done via JS by Google, I think), or you could do it in Flash.
DVK
@DVK - can you provide more details on other methods to help address this issue? I'd prefer JS, but if Flash is the only option, that would still be better than where we are now. Thanks!
Russell C.
To be honest, I don't recall the details at the moment, but if you search Stack Overflow for multi-part upload, you should hit a couple of questions where this was discussed in detail.
DVK