views: 91
answers: 4
Data: $data = array('Alice', 'Bob', 'Carol', 'David', 'Elizabeth', 'Frank');


Method 1:

file_put_contents('filename.ext', implode("\n", $data));

Method 2:

$fp = fopen('filename.ext', 'w');
foreach ($data as $name) {
    fwrite($fp, $name . "\n");
}
fclose($fp);


Does one method have any significant penalties over the other?

Is one significantly faster, even at some cost? At no cost?

Any preferences? Is it situational? Which would you use in production code vs. one-use throwaway scripts?

Note: please ignore any issues of checking whether the file is writable or whether the file pointer is false. Assume zero friction and that everything just "works".

+2  A: 

From the docs for file_put_contents:

This function is identical to calling fopen(), fwrite() and fclose() successively to write data to a file.

It would seem that constructing the string and writing it all at once would be more efficient in terms of I/O: the data goes out in one large chunk rather than in many smaller writes, which is generally preferable for I/O performance.
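
One way to check is a quick micro-benchmark (a rough sketch; the 60000-item array and file names are made up, and timings will vary by system and PHP version):

$data = array_fill(0, 60000, 'some name');

$start = microtime(true);
file_put_contents('a.txt', implode("\n", $data));
echo 'implode + file_put_contents: ', microtime(true) - $start, "s\n";

$start = microtime(true);
$fp = fopen('b.txt', 'w');
foreach ($data as $name) {
    fwrite($fp, $name . "\n");
}
fclose($fp);
echo 'fopen/fwrite loop: ', microtime(true) - $start, "s\n";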

jheddings
Would it change your statement if the data array wasn't 6 items but 60,000? That would mean building (via the implode) and temporarily buffering a very large string in memory, in addition to the very large array you already have; for a brief moment you would need two copies of it in RAM.
Uberfuzzy
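
A rough way to see that trade-off (a hypothetical sketch; sizes are made up and exact numbers vary by PHP version):

$data = array_fill(0, 60000, str_repeat('x', 100)); // ~6 MB of string data

$before = memory_get_peak_usage();
$blob   = implode("\n", $data); // a second, temporary copy of everything
echo 'implode() grew the peak by ', memory_get_peak_usage() - $before, " bytes\n";
file_put_contents('big.txt', $blob);
// The fwrite() loop never builds that second copy.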
+6  A: 

Use serialize() rather than a half-baked imitation:

file_put_contents('filename.ext', serialize($data));

unless you need the file to be human readable and/or editable for whatever reason, in which case you need to carefully consider what data you're persisting to file so you can come up with a robust means of storing it.
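
Reading it back in is then the mirror image (a minimal sketch, reusing the same hypothetical filename):

// Round trip: unserialize() restores exactly what serialize() wrote.
$data = unserialize(file_get_contents('filename.ext'));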

As for file_put_contents() vs. your loop, just use file_put_contents() unless you can't. It's less code and easier to read. I doubt there are any real differences between the implementations, and even if there were, any PHP-side difference is dwarfed by the cost of the disk I/O itself. Don't sweat the small (irrelevant) stuff.

cletus
Actually, in most cases I do need plain-text "1 item per line" files, as they are work queues for other scripts but need to be human-skimmed/edited to remove a few lines. Script-to-script I do use serialize(), no questions asked, and in that case I use file_put_contents() because there's no way to get the result of serialize() except as one huge string anyway.
Uberfuzzy
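
For that one-item-per-line format, file() reads the queue straight back into an array (a small sketch, assuming the same 'filename.ext'):

// One array element per line; drop trailing newlines and blank lines.
$data = file('filename.ext', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);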
A: 

file_put_contents() wins. fopen()/fwrite()/fclose() is just verbose for no reason.

As Cletus says, just serialize the data with serialize().

Unless you're dealing with multibyte characters, which sometimes break serialize() (you can find various userland implementations of mb_serialize() online if you need that).

I'm not sure about the performance implications, but you might prefer storing your array as a JSON-serialized string (json_encode()). That way, if some other system ever needs to read the file, you'll be using a standard serialization format instead of a PHP-specific one. I seriously doubt there's any real performance difference either way.
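
That would look something like this (a sketch; json_encode()/json_decode() need PHP 5.2+, and 'filename.json' is a placeholder):

file_put_contents('filename.json', json_encode($data));
// Later, possibly from another system entirely:
$data = json_decode(file_get_contents('filename.json'), true); // true => plain arrays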

timdev
I wish I could switch from serialize() to json_encode(), but it's not my code, and much of it has legacy ties to other external things I can't get changed :(
Uberfuzzy
A: 

You may also try using fputcsv():

$fp = fopen('file.csv', 'w');
fputcsv($fp, $data); // writes the whole array as a single CSV row
fclose($fp);
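
Note that fputcsv() writes one CSV row per call, so to get the question's one-name-per-line layout you would loop instead (sketch):

$fp = fopen('file.csv', 'w');
foreach ($data as $name) {
    fputcsv($fp, array($name)); // each name becomes its own row
}
fclose($fp);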

As for optimization, it's PHP and the OS that are responsible for read/write optimization.

Salman A