I am trying to find the most efficient way to post large files from a Python application to a Django server.

If I rely on raw_post_data on the Django side, the entire request body has to be in RAM before I can read it, which doesn't seem efficient at all if the received file is hundreds of megabytes.

Is it better to use Django's file upload handling, which means sending a multipart/form-data POST?

Or is there something better?

Laurent
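On the client side, the key is to read and send the file in fixed-size pieces rather than loading it whole. A minimal sketch of that chunked read (the chunk size is an arbitrary choice, and the commented usage assumes the third-party `requests` library, which streams the body when given an iterator as `data`; the URL is a placeholder):

```python
CHUNK_SIZE = 64 * 1024  # 64 KB per read; tune to taste


def iter_chunks(fileobj, chunk_size=CHUNK_SIZE):
    """Yield a file in fixed-size pieces so the whole body never sits in RAM."""
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        yield chunk


# Hypothetical usage (URL is a placeholder):
# import requests
# with open("big.bin", "rb") as f:
#     requests.post("http://example.com/upload/", data=iter_chunks(f))
```

Note that a plain raw-body POST like this lands in raw_post_data on the Django side; streaming a multipart/form-data body instead requires building the multipart framing around the chunks.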

+3  A: 

I think only files smaller than 2.5MB are held in memory; any file larger than that is streamed to a temporary file in the temp directory.

References: http://simonwillison.net/2008/Jul/1/uploads/ and http://docs.djangoproject.com/en/dev/topics/http/file-uploads/
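Because of that spooling behaviour, the server side should also copy the upload to its destination chunk by chunk (in a Django view this is what iterating over `request.FILES["..."].chunks()` does). A stand-alone sketch of the same pattern, with a hypothetical helper name and caller-supplied paths:

```python
def save_upload(uploaded, destination_path, chunk_size=64 * 1024):
    """Copy an upload to disk piece by piece, mirroring Django's
    UploadedFile.chunks() pattern so memory use stays bounded."""
    with open(destination_path, "wb") as dest:
        while True:
            chunk = uploaded.read(chunk_size)
            if not chunk:
                break
            dest.write(chunk)
```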

Mohamed
+1 Plus, Django's file objects supply methods to read the upload in chunks
czarchaic
Mohamed: Are you referring to raw_post_data or to the upload methods?
Laurent Luce
+1  A: 

If you really want to optimize this and don't want Django tied up while the bytes are being streamed (occupying one of its worker threads), you can use the nginx upload module (see also this blog post).

Peter Bengtsson
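The idea is that nginx itself receives and writes the file, then hands Django only the path. A rough configuration sketch, assuming nginx is built with the upload module; the location names and paths are placeholders:

```
location /upload/ {
    # nginx accepts and stores the body itself; Django never streams it
    upload_store /var/tmp/nginx_uploads;
    # pass the on-disk path to the backend as a form field
    upload_set_form_field $upload_field_name.path "$upload_tmp_path";
    # then forward the (now small) request to the Django-backed location
    upload_pass /handle_upload/;
}
```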
I am using Lighttpd with Django and I am hoping I can stick with that for now.
Laurent Luce