views:

4178

answers:

7

I am trying to upload data to Google App Engine (using GWT). I am using the FileUploader widget and the servlet uses an InputStream to read the data and insert directly to the datastore. Running it locally, I can upload large files successfully, but when I deploy it to GAE, I am limited by the 30 second request time. Is there any way around this? Or is there any way that I can split the file into smaller chunks and send the smaller chunks?
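For reference, the server-side read described above can be sketched in plain Java. The `readAll` helper below is hypothetical, standing in for however the servlet drains the upload `InputStream` before writing the bytes to the datastore:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamReader {
    // Drain the upload stream fully into memory, as the servlet would
    // before inserting the bytes into the datastore.
    public static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100_000];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        byte[] copy = readAll(new ByteArrayInputStream(data));
        System.out.println(copy.length); // 100000
    }
}
```

On a slow connection, a loop like this is exactly what runs up against the 30-second request deadline: the handler blocks in `read` until the client has pushed the whole file.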

+1  A: 

You would need to do the upload to another server; I believe the 30-second timeout cannot be worked around. If there is a way, please correct me! I'd love to know how!

Chii
That's what I was afraid of. I guess it isn't too bad... I could just use Amazon S3 for the storage. That would simplify the storage part, but it would still be nice to keep everything in one place.
+6  A: 

Currently, GAE imposes a 10 MB limit on file uploads (and response sizes), as well as 1 MB limits on many other things; so even if you had a network connection fast enough to push more than 10 MB within a 30-second window, it would be to no avail. Google has said (I heard Guido van Rossum mention it yesterday here at Pycon Italia Tre) that it plans to overcome these limitations in the future (at least for users of GAE who pay per-use to exceed quotas; I'm not sure whether the plans extend to non-paying users, who generally must accept smaller quotas for their free use of GAE).

Alex Martelli
A: 

The closest you could get would be to split it into chunks as you store it in GAE and then when you download it, piece it together by issuing separate AJAX requests.
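A minimal sketch of that idea in plain Java (the `split` and `join` helpers are hypothetical; in practice each chunk would be stored as a separate datastore entity and fetched by its own AJAX request):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Chunker {
    // Split a blob into fixed-size chunks; the last chunk may be shorter.
    public static List<byte[]> split(byte[] data, int chunkSize) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < data.length; off += chunkSize) {
            int len = Math.min(chunkSize, data.length - off);
            chunks.add(Arrays.copyOfRange(data, off, off + len));
        }
        return chunks;
    }

    // Piece the chunks back together, e.g. after downloading each one
    // with a separate request.
    public static byte[] join(List<byte[]> chunks) {
        int total = 0;
        for (byte[] c : chunks) total += c.length;
        byte[] out = new byte[total];
        int off = 0;
        for (byte[] c : chunks) {
            System.arraycopy(c, 0, out, off, c.length);
            off += c.length;
        }
        return out;
    }
}
```

Keeping each chunk under the datastore's 1 MB entity limit also keeps each individual request comfortably inside the 30-second window.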

tomjen
A: 

Hi fpoint, could you please show me how you store the big file directly? From what I have read from Google, the only way to store a file is to pack it into some form of object and store the whole object.

BlueBlood
A: 

I would agree with chunking the data into smaller Blobs and keeping two tables: one contains the metadata (filename, size, number of downloads, etc.) and the other contains the chunks, which are associated with the metadata table by a foreign key. I think it is doable...

Or, once all the chunks are uploaded, you can simply put them together into one blob and keep a single table.

But the problem is, you will need a thick client to do the chunking, like a Java applet, which needs to be signed and trusted by your clients so it can access the local file system.
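The two-table scheme above could be sketched like this. `ChunkStore` is an illustrative in-memory stand-in, with the maps playing the role of the metadata and chunk tables; in the real datastore, each chunk entity would hold the metadata entity's key as its foreign key:

```java
import java.util.HashMap;
import java.util.Map;

public class ChunkStore {
    // One metadata record per file (the "metadata table").
    static class FileMeta {
        final String filename;
        final long size;
        final int numChunks;
        FileMeta(String filename, long size, int numChunks) {
            this.filename = filename;
            this.size = size;
            this.numChunks = numChunks;
        }
    }

    private final Map<String, FileMeta> metaTable = new HashMap<>();
    // Chunks keyed by fileId + index (the "chunk table").
    private final Map<String, byte[]> chunkTable = new HashMap<>();

    public void putMeta(String fileId, String filename, long size, int numChunks) {
        metaTable.put(fileId, new FileMeta(filename, size, numChunks));
    }

    public void putChunk(String fileId, int index, byte[] data) {
        chunkTable.put(fileId + "/" + index, data);
    }

    // Reassemble the file from its chunks in index order.
    public byte[] reassemble(String fileId) {
        FileMeta meta = metaTable.get(fileId);
        byte[] out = new byte[(int) meta.size];
        int off = 0;
        for (int i = 0; i < meta.numChunks; i++) {
            byte[] c = chunkTable.get(fileId + "/" + i);
            System.arraycopy(c, 0, out, off, c.length);
            off += c.length;
        }
        return out;
    }
}
```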

Sleem
A: 

If your request is running out of request time, there is little you can do. Maybe your files are too big and you will need to chunk them on the client (with something like Flash or Java or an upload framework like pupload).

Once you get the file to the application there is another issue - the datastore limitations. Here you have two options:

  • you can use the BlobStore service, which has a quite nice API for handling uploads of up to 50 megabytes

  • you can use something like bigblobae, which can store blobs of virtually unlimited size in the regular App Engine datastore.

Honza
+2  A: 

By using the BlobStore you get a 1 GB size limit and a special handler, called, unsurprisingly, BlobstoreUploadHandler, that shouldn't give you timeout problems on upload.

Also check out http://demofileuploadgae.appspot.com/ (sourcecode, source answer) which does exactly what you are asking.

Also, check out the rest of GWT-Examples.

voyager