large-file-support

How to solve this compatibility problem regarding large file support?

A library uses off_t as a parameter for one function (seek). The library and the application are compiled differently: one with large file support switched off, the other with it switched on. This results in strange runtime errors, because the two sides interpret off_t differently. How can the library check at runtime the size of off_t for...
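
A common workaround is for the library to export its own compiled-in sizeof(off_t) so the application can verify the ABI at startup. A minimal sketch, assuming a hypothetical mylib_off_t_size() function added to the library's API:

    /* In the library, compiled with the library's LFS setting: */
    #include <sys/types.h>
    #include <stddef.h>

    size_t mylib_off_t_size(void)
    {
        /* 4 on 32-bit builds without LFS, 8 with _FILE_OFFSET_BITS=64 */
        return sizeof(off_t);
    }

    /* In the application, compiled with its own LFS setting: */
    #include <stdio.h>
    #include <stdlib.h>

    void check_off_t_abi(void)
    {
        if (mylib_off_t_size() != sizeof(off_t)) {
            fprintf(stderr, "off_t mismatch: library=%zu, app=%zu\n",
                    mylib_off_t_size(), sizeof(off_t));
            exit(EXIT_FAILURE);
        }
    }

If the sizes differ, the only safe fixes are recompiling one side or changing the seek function's parameter to an explicit 64-bit type (e.g. int64_t) that does not depend on compile flags.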

Handling large uploads on Django, exceeding the max size on nginx

We have a Django app on nginx where users upload media files. The media files are huge, such as 30-minute TV and radio programs that come to 100-300 MB, and our shared hosting limits uploads to 30 MB. How can we embed a smart uploader that sends chunks of 20-30 MB instead of trying to upload the large file in one request? We would like not to destroy our highly edi...
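
The usual pattern is a client-side uploader that splits the file into pieces under the server limit, plus a view that reassembles them. The splitting half is sketched below in C, to match the other examples on this page; the 25 MB chunk size and the .partNNN naming are illustrative, and each piece would then go out as its own HTTP POST:

    #include <stdio.h>
    #include <stdlib.h>

    #define CHUNK_SIZE (25 * 1024 * 1024)  /* stay under the 30 MB limit */

    /* Split path into path.part000, path.part001, ...; returns the
       number of chunks written, or -1 on error. */
    int split_file(const char *path)
    {
        FILE *in = fopen(path, "rb");
        if (!in) return -1;

        char *buf = malloc(CHUNK_SIZE);
        if (!buf) { fclose(in); return -1; }

        int part = 0;
        size_t n;
        while ((n = fread(buf, 1, CHUNK_SIZE, in)) > 0) {
            char name[4096];
            snprintf(name, sizeof name, "%s.part%03d", path, part++);
            FILE *out = fopen(name, "wb");
            if (!out || fwrite(buf, 1, n, out) != n) {
                if (out) fclose(out);
                free(buf); fclose(in); return -1;
            }
            fclose(out);
        }
        free(buf);
        fclose(in);
        return part;
    }

Server-side, a Django view would accept each piece (each one safely under the nginx limit) and append it to the target file once all parts have arrived.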

Python class to merge sorted files, how can this be improved?

Background: I'm cleaning large (cannot be held in memory) tab-delimited files. As I clean the input file, I build up a list in memory; when it reaches 1,000,000 entries (about 1 GB in memory) I sort it (using the default key below) and write the list to a file. This class is for putting the sorted files back together. It works on the f...
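
The second phase is a classic k-way merge of sorted runs. In Python, heapq.merge() does exactly this lazily; the underlying algorithm, shown here in C to match the page's other sketches, keeps one current line per file and repeatedly emits the smallest:

    #include <stdio.h>
    #include <string.h>

    #define MAXLINE 4096

    /* Merge n sorted text files into out with a linear scan over the
       head line of each run (adequate for a modest number of runs;
       use a min-heap when n grows large). */
    void merge_sorted(FILE **in, int n, FILE *out)
    {
        char head[n][MAXLINE];
        int alive[n];

        for (int i = 0; i < n; i++)
            alive[i] = fgets(head[i], MAXLINE, in[i]) != NULL;

        for (;;) {
            int min = -1;
            for (int i = 0; i < n; i++)
                if (alive[i] && (min < 0 || strcmp(head[i], head[min]) < 0))
                    min = i;
            if (min < 0) break;  /* every run is exhausted */
            fputs(head[min], out);
            alive[min] = fgets(head[min], MAXLINE, in[min]) != NULL;
        }
    }

Because only one line per run is held at a time, memory use stays flat no matter how large the input files are.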

Paperclip, large file uploads, and AWS

So, I'm using Paperclip and AWS-S3, which is awesome. And it works great. Just one problem, though: I need to upload really large files, as in over 50 megabytes, and so nginx dies. So apparently Paperclip stores things to disk before going to S3? I found this really cool article, but it also seems to be going to disk first, and then do...

Opening Large (24 GB) File In C

I'm trying to read a 24 GB XML file in C, but it won't work. I'm printing out the current position using ftell() as I read it in, but once it gets to a big enough number it jumps back to a small number and starts over, never getting even 20% of the way through the file. I assume this is a problem with the range of the variable that's used to s...
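
That wraparound is the classic symptom of ftell() returning a 32-bit long. The usual fix is to request 64-bit offsets at compile time and use the off_t variants ftello()/fseeko(); a minimal sketch (huge.xml is a placeholder filename):

    /* Must appear before any #include to get a 64-bit off_t on 32-bit
       builds; equivalently, compile with -D_FILE_OFFSET_BITS=64. */
    #define _FILE_OFFSET_BITS 64

    #include <stdio.h>
    #include <sys/types.h>

    int main(void)
    {
        FILE *f = fopen("huge.xml", "rb");
        if (!f) return 1;

        char buf[1 << 16];
        size_t n;
        while ((n = fread(buf, 1, sizeof buf, f)) > 0) {
            off_t pos = ftello(f);  /* 64-bit-safe, unlike ftell() */
            fprintf(stderr, "at byte %lld\n", (long long)pos);
            /* ... feed buf[0..n) to the XML parser ... */
        }
        fclose(f);
        return 0;
    }

ftello() and fseeko() are POSIX; on Windows the analogues are _ftelli64() and _fseeki64().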