I have a PHP script that lets users upload very large files (up to 500 MB), and the file's content is stored in a MySQL database. Currently I do something like this:

mysql_query("INSERT INTO table VALUES('')");

$uploadedfile = fopen($_FILES['file']['tmp_name'], 'rb');
while (!feof($uploadedfile)) {
    $line = mysql_escape_string(fgets($uploadedfile, 4096));
    mysql_query("UPDATE table SET file = CONCAT(file, '$line') WHERE something = something");
}
fclose($uploadedfile);

This of course runs a bloody lot of SQL queries.

I did that rather than something like

$file = file_get_contents($_FILES['file']['tmp_name']);
mysql_query("INSERT INTO table VALUES('$file')");

because that would use up as much memory as the file is large, and it seemed better to run more SQL queries than to use 500 MB of memory.

However, there must be a better way. Should I go ahead and do it the file_get_contents way, or is there a better way than CONCAT, or is the way I'm doing it now the lesser of all evils?
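
For reference, a middle ground between the two approaches (a sketch only, using the same legacy mysql_* API and the placeholder table/column names from above) would be to read the file in larger fixed-size binary chunks with fread(), which keeps memory use bounded while cutting the query count by a couple of orders of magnitude:

mysql_query("INSERT INTO table VALUES('')");

$chunkSize = 1048576; // 1 MB per read; escaping can double this, so keep it well under max_allowed_packet
$fh = fopen($_FILES['file']['tmp_name'], 'rb');
while (!feof($fh)) {
    // fread() returns raw bytes regardless of newlines, unlike fgets()
    $chunk = mysql_real_escape_string(fread($fh, $chunkSize));
    mysql_query("UPDATE table SET file = CONCAT(file, '$chunk') WHERE something = something");
}
fclose($fh);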

+2  A: 

I always store my files on the server, and store their location in the database.

Liam Bailey
I chose to store the files in an SQL database because the other option doesn't work for me. I wouldn't go to all this trouble if I could just save them on disk.
Nicholas Tine
@Nicholas: Could you explain why? My first thought was that file systems are good for files...
Preet Sangha
@Liam: Good answer. Storing the whole file in the table is like parking your car in your pocket: better to park the car in the garage and put the key in your pocket. The same goes for files: better to store the file in a folder and its location in the table.
Lucanos
A: 

I would imagine that the most effective way to do this would be to do all the validation in the script up to the point of the insert, then shell out and pipe the uploaded $_FILES temp file into a MySQL command-line insert query. You'd want someone better at bash than me to validate that, but it seems it would pretty much remove the memory issue. One concrete way to realize the idea is sketched below.
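
A sketch of that idea, assuming the web server and the MySQL server share a filesystem and the MySQL user has the FILE privilege (both assumptions, not givens), using MySQL's built-in LOAD_FILE(), which reads the file on the server side so PHP never holds it in memory; the temp path and table names here are hypothetical:

$dst = '/tmp/upload.bin'; // hypothetical path; must be readable by mysqld
move_uploaded_file($_FILES['file']['tmp_name'], $dst);
mysql_query("UPDATE table SET file = LOAD_FILE('$dst') WHERE something = something");
unlink($dst);

Note that LOAD_FILE() returns NULL for files larger than max_allowed_packet, so the limit discussed in the next answer still applies.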

FatherStorm
A: 

This wouldn't actually work with MySQL by default, because it would produce a single 500 MB query:

$file = file_get_contents($_FILES['file']['tmp_name']);
mysql_query("INSERT INTO table VALUES('$file')");

The reason is that max_allowed_packet is set to 16777216 bytes (16 MB). You would either have to increase it or split the file into chunks smaller than 16 MB (minus roughly 500-1000 bytes for the rest of the query string).

You can find out your MySQL server's max_allowed_packet by running:

SELECT @@global.max_allowed_packet
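
From PHP, that check plus a safe chunk size might look like this (a sketch, using the question's legacy mysql_* API; escaped binary data can double in length, hence the division by two):

$result = mysql_query("SELECT @@global.max_allowed_packet");
$row = mysql_fetch_row($result);
$maxPacket = (int) $row[0];

// Leave ~1 KB for the query text itself, then halve for escaping overhead
$chunkSize = (int) floor(($maxPacket - 1024) / 2);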
Tseng
+1  A: 

Could you store these files with a third-party file storage service?

Daniel
Nope, I have to store them in an SQL database.
Nicholas Tine
Why? A database is for data, not files.
Liam Bailey
+1  A: 

I have yet to see an application that actually needs to store files in a relational database.

There are a significant number of freely available, powerful databases out there that are designed and optimized specifically for storing and retrieving files. They're called filesystems.

Store your files in your filesystem, and your metadata in the RDBMS.
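
To illustrate that split (a sketch; the uploads directory and the files table here are invented for the example), the upload handler moves the file to disk and records only its metadata:

$dest = '/var/uploads/' . uniqid('upload_', true); // hypothetical storage dir
move_uploaded_file($_FILES['file']['tmp_name'], $dest);

$name = mysql_real_escape_string($_FILES['file']['name']);
$path = mysql_real_escape_string($dest);
$size = (int) $_FILES['file']['size'];
mysql_query("INSERT INTO files (name, path, size) VALUES ('$name', '$path', $size)");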

You're worried about using up 500 MB of memory while inserting, and it's not clear why: you're eventually going to want to get those files back out of the database, and I don't think you'll find an easy way to read the file data out in chunks.

You keep saying that you "need to" store files in the database.

I call shenanigans.

timdev