views:

4196

answers:

5

fopen is failing when I try to read in a very moderately sized file in PHP. A 6 meg file makes it choke, though smaller files around 100k are just fine. I've read that it is sometimes necessary to recompile PHP with the -D_FILE_OFFSET_BITS=64 flag in order to read files over 20 gigs or something ridiculous, but shouldn't I have no problems with a 6 meg file? Eventually we'll want to read in files that are around 100 megs, and it would be nice to be able to open them and then read through them line by line with fgets, as I'm able to do with smaller files.

What are your tricks/solutions for reading and doing operations on very large files in PHP?

Update: Here's an example of a simple code block that fails on my 6 meg file - PHP doesn't seem to throw an error, it just returns false. Maybe I'm doing something extremely dumb?

$rawfile = "mediumfile.csv";

if($file = fopen($rawfile, "r")){
fclose($file); } else { echo "fail!"; }

Another update: Thanks all for your help, it did turn out to be something incredibly dumb - a permissions issue. My small file inexplicably had read permissions when the larger file didn't. Doh!
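
For anyone else hitting the same silent false return: fopen emits a warning rather than throwing, so a quick check like the following (a generic diagnostic sketch, not code from the question) would have surfaced the permissions problem right away.

error_reporting(E_ALL);
ini_set("display_errors", "1");

$rawfile = "mediumfile.csv";

if (!file_exists($rawfile)) {
    echo "File not found: $rawfile";
} elseif (!is_readable($rawfile)) {
    // fileperms() returns the mode bits; show the lower bits as octal.
    printf("No read permission on %s (mode %o)", $rawfile, fileperms($rawfile) & 0777);
} else {
    $file = fopen($rawfile, "r");
    fclose($file);
    echo "Opened fine.";
}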

A: 

Well, you could try using the readfile function if you just want to output the file.

If that's not the case, maybe you should think about the design of the application: why do you want to open such large files during web requests?
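
A minimal sketch of the readfile approach (assuming you only need to pass the file through to the browser, not parse it; the path and headers here are just placeholders):

// Stream the file straight to the output buffer without loading it
// all into memory; readfile() returns the number of bytes read, or false.
$path = "mediumfile.csv";
header("Content-Type: text/csv");
header("Content-Length: " . filesize($path));
readfile($path);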

Fionn
We've got to automate adding large sets of data, so large CSV files can be uploaded by the user and are parsed and integrated into the database by the application. I'd love other suggestions on approach if you think reading and parsing uploaded files with PHP isn't the best way to go.
Erik
I wouldn't think PHP would have a problem with 6 MB CSV files? Seems like a small enough file for it to handle. As per the comments above, please post the exact error and/or code. Could it be a memory error you're hitting? Or max_execution_time? We need more info to help.
DreamWerx
A: 

I used fopen to open video files for streaming, using a PHP script as a video streaming server, and I had no problem with files larger than 50/60 MB.
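
That kind of streaming usually means reading the file in fixed-size chunks rather than all at once; a rough sketch (the path and chunk size are assumptions, not details from this answer):

// Send a large file in 8 KB chunks so memory use stays flat
// regardless of the file's size.
$handle = fopen("/path/to/video.mp4", "rb") or die("Couldn't get handle");
while (!feof($handle)) {
    echo fread($handle, 8192);
    flush(); // push each chunk to the client as it is read
}
fclose($handle);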

Enreeco
A: 

Have you tried file()?

http://is2.php.net/manual/en/function.file.php

Or file_get_contents()?

http://is2.php.net/manual/en/function.file-get-contents.php
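
Worth noting: both of those load the entire file into memory at once, which is fine for 6 MB but something to keep in mind for the 100 MB files mentioned in the question. A quick sketch of each (the filename is assumed):

$rawfile = "mediumfile.csv";

// file() returns an array with one entry per line.
$lines = file($rawfile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($lines as $line) {
    // process $line here
}

// file_get_contents() returns the whole file as a single string.
$contents = file_get_contents($rawfile);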

Ólafur Waage
+3  A: 

Are you sure that it's fopen that's failing and not your script's timeout setting? The default is usually around 30 seconds or so, and if your file is taking longer than that to read in, it may be tripping that up.

Another thing to consider may be the memory limit on your script - reading the file into an array may trip over this, so check your error log for memory warnings.
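
If either limit turns out to be the culprit, you can inspect and raise them at the top of the script (the values below are only examples, and the overrides only work if the host allows them):

// Check the current limits first.
echo ini_get("max_execution_time"); // in seconds, 0 means no limit
echo ini_get("memory_limit");

// Example overrides for a long-running import.
set_time_limit(300);            // allow up to 5 minutes for this request
ini_set("memory_limit", "64M"); // raise the per-script memory ceiling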

If neither of the above is your problem, you might look into using fgets to read the file in line-by-line, processing as you go.

$handle = fopen("/tmp/uploadfile.txt", "r") or die("Couldn't get handle");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        // Process buffer here..
    }
    fclose($handle);
}
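
Since the uploads in question are CSVs, a variant using fgetcsv (an assumption about the format, not part of the original snippet) keeps the same line-by-line memory profile while splitting each row into fields:

$handle = fopen("/tmp/uploadfile.csv", "r") or die("Couldn't get handle");
while (($row = fgetcsv($handle, 4096, ",")) !== false) {
    // $row is an array of the fields on this line;
    // insert into the database here, ideally in batches.
}
fclose($handle);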

Edit

PHP doesn't seem to throw an error, it just returns false.

Is the path to $rawfile correct relative to where the script is running? Perhaps try setting an absolute path here for the filename.

ConroyP
A: 

If the problem is caused by hitting the memory limit, you can try setting it to a higher value (this may or may not work, depending on PHP's configuration).

// This sets the memory limit to 12 MB
ini_set("memory_limit", "12M");

Juan Pablo Califano
Note: While this may help, it only postpones the problem: once a 15 MB file comes in, the problem comes back. (If your files won't ever go over a certain limit, this may make the problem go away.)
Piskvor