views:

109

answers:

3

I am trying to read large files (e.g. Illustrator or Photoshop files) using a cron job on my system. File sizes vary from 20 MB to 300 MB.

I have been using some of these functions, but they break in the middle of reading, so I wanted a fresh opinion.

Among these functions:

  • file_get_contents
  • readfile
  • curl

which is most effective in terms of:

  • consistency (should not break while reading the file)
  • speed
  • resource usage

Also, if there are more than two cron jobs running, does that impact overall server performance?

Please share best practice code.

Thanks in advance

A: 

You need to split this into two tasks if the files are that big: first download the file with wget, and once you have the file, process it with PHP. This way you are less likely to run into timeout problems.

If you don't know which file to download because it comes from a PHP variable of some sort, write the name of the required file to a list file as the first step of your job,

then pass that list to wget via --input-file=file as the second step,

and then process the download with your PHP program as the third and final step.
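The three steps above could be wired together in a small wrapper script invoked by cron. This is only a sketch: every path and script name here is an assumption, not something from the answer.

```shell
#!/bin/sh
# Hypothetical cron wrapper for the three-step job described above.

# Step 1: a PHP script decides which file(s) are needed and prints their URLs.
php /path/to/write_list.php > /tmp/files-to-fetch.txt

# Step 2: wget downloads everything named in the list file.
wget --input-file=/tmp/files-to-fetch.txt --directory-prefix=/tmp/downloads/

# Step 3: a second PHP script processes the downloaded files.
php /path/to/process_downloads.php /tmp/downloads/
```

Because the download runs under wget rather than inside PHP, the PHP processing step never has to hold an open network connection for a 300 MB transfer.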

sathia
A: 

DirectIO (the dio extension) is a low-level extension that bypasses the operating system's I/O buffering and goes straight to the disk; as a result it is probably the most efficient.

http://php.net/manual/en/ref.dio.php

Note that as of PHP 5.1.0 it is no longer bundled with PHP. Also, if your script is breaking in the middle of the operation, check your max_execution_time and memory_limit settings.
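Those two settings live in php.ini (or can be set per-script with ini_set()). The values below are purely illustrative, not recommendations from the answer:

```ini
; php.ini — illustrative values only
max_execution_time = 0   ; no time limit (the CLI SAPI already defaults to 0)
memory_limit = 512M      ; headroom if a ~300 MB file must be held in memory
```

Note that streaming a file in chunks, rather than loading it whole, avoids needing a large memory_limit in the first place.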

theAlexPoon
+2  A: 

Use cURL. The file functions have been deprecated in favor of cURL for opening remote files. It's not only faster, but also more reliable1 (you are less likely to experience timeouts).
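A minimal sketch of the cURL approach, streaming the response straight to disk so the whole file never sits in memory. The function name, options chosen, and stall thresholds are assumptions for illustration, not part of the answer.

```php
<?php
// Download $url to $dest with cURL, writing directly to a file handle
// instead of buffering the body in memory.
function download_to_file(string $url, string $dest): bool
{
    $fp = fopen($dest, 'wb');
    if ($fp === false) {
        return false;
    }
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);            // stream body to $fp
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 0);           // no overall time limit
    curl_setopt($ch, CURLOPT_LOW_SPEED_LIMIT, 1);   // but abort if stalled
    curl_setopt($ch, CURLOPT_LOW_SPEED_TIME, 60);   // below 1 B/s for 60 s
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    return $ok === true;
}
```

Disabling the overall timeout while keeping a low-speed abort means a slow-but-alive 300 MB transfer can finish, while a dead connection still fails promptly.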

If your script times out or runs out of memory anyway, you'll want to increase the execution time and memory limits (max_execution_time and memory_limit).

Other notes:

  • readfile() reads a file and prints it to the output buffer; it's not the same thing as file_get_contents().
  • If you compile cURL with --with-curlwrappers, then file_get_contents() will use cURL instead of the fopen() wrappers.

1 Citation needed.

NullUserException