I am going to build an image hosting system in PHP, and I am wondering how I should hide my images from users so that they can only be accessed through a special URL.

I would like to discuss all techniques, including .htaccess.

+3  A: 

You write a little PHP script that reads the image file and sends its contents to the client. The script can check its parameters and cookies, and the image itself is stored somewhere outside the document root.

innaM
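A minimal sketch of that approach, assuming a hypothetical serve.php, an image directory outside the document root, and a simple token check standing in for whatever parameter/cookie validation you actually use:

<?php
// serve.php?img=cat.jpg&token=... (hypothetical script and parameter names)
$dir   = '/var/private_images';            // somewhere outside the document root
$img   = basename($_GET['img'] ?? '');     // strip any path components
$token = $_GET['token'] ?? '';

// Replace this with your real check (cookie, session, signed token, ...)
if ($img === '' || $token !== 'expected-token') {
    http_response_code(403);
    exit;
}

$path = $dir . '/' . $img;
if (!is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: image/jpeg');        // or detect it with mime_content_type($path)
header('Content-Length: ' . filesize($path));
readfile($path);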
+3  A: 

Just don't store your images in the web root. Use a PHP file to manage access. When you want to show a file, do something like:

<?php
// the image is stored outside the web root, so it can only be reached through this script
header('Content-Type: image/jpeg');
$f = file_get_contents('path/to/image.jpg');
print $f;
?>
Seth
+8  A: 

Put the files outside of the web root and use a script like showimage.php to grab them and stream them down to the user. The code would look something like:

$len = filesize($filename);
header("Content-Type: image/jpeg");
header("Content-Length: $len");
// "inline" asks the browser to display the image rather than download it
header("Content-Disposition: inline; filename=\"$new_filename\"");
readfile($filename);   // streams the file to the client without loading it all into memory

Additionally, since you're running a script, you can do authentication/authorization in the script. This also allows you to set up a mod_rewrite rule such as:

RewriteEngine On
RewriteRule ^images/(.*)$   /showimage.php?file=$1

so that your image URLs can be rendered as:

www.domain.com/images/somefile.jpg

instead of:

www.domain.com/showimage.php?file=somefile.jpg
Parvenu74
Thanks, I like your way. But are those all of the header attributes, or is there anything else that could improve the process?
assaqqaf
Good call with the use of htaccess. +1
Christopher W. Allen-Poole
@assaqqaf What about the process would you like improved?
Parvenu74
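Expanding a little on the question above about improving the process: a rough sketch of what such a showimage.php could look like once an authentication check and some input sanitization are added. The session check, directory, and parameter handling here are assumptions, not part of the answer itself:

<?php
// showimage.php, the target of: RewriteRule ^images/(.*)$ /showimage.php?file=$1
session_start();

if (empty($_SESSION['user_id'])) {          // hypothetical authentication check
    http_response_code(403);
    exit('Forbidden');
}

$baseDir  = '/srv/private_images';          // outside the web root
$filename = basename($_GET['file'] ?? '');  // blocks ../ path traversal
$path     = $baseDir . '/' . $filename;

if ($filename === '' || !is_file($path)) {
    http_response_code(404);
    exit('Not found');
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: inline; filename="' . $filename . '"');
readfile($path);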
A: 

Put them in a directory one level up from your document root. Then use an .htaccess file to grab them:

RewriteEngine On
RewriteBase /
RewriteRule ^(.+?)\.jpg$ ../imgs/$1.jpg
brianreavis
+1  A: 

Instead of having a PHP process send the file and hog server resources for as long as it is sending the content, I would prefer to create a script that copies the file over to a directory under a random name and then just sends that file to the user, with another script clearing out that directory at set intervals (if the creation time is older than 30 minutes, delete the file). That way you only need an Apache process to send back the file instead of an Apache process plus a PHP process.

Sabeen Malik
Really, I was thinking about resources with the first solution. But could you explain your idea more so I can understand it?
assaqqaf
OK, so for instance you have your images stored in /images/, and there is an image the user wants to download; let's say it's called x.jpg, so its path is /images/x.jpg. The download link will look something like download.php?fileid=whateverid (you can have this mod-rewritten if you want). download.php copies the image to /downloads/2342342asdfas.jpg (a random name) and redirects the user to that URL. Then you would have another script, run by cron, which deletes files created more than X minutes ago. This way you keep the load on the server minimal and keep your image paths anonymous.
Sabeen Malik
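A minimal sketch of that copy-and-redirect idea, with illustrative directory names and a stand-in for the fileid lookup:

<?php
// download.php?fileid=... copies the image to a randomly named public file and
// redirects, so the real path stays hidden and PHP exits almost immediately.
// The directories and the id-to-filename lookup below are illustrative assumptions.
$privateDir = '/srv/private_images';          // originals, outside the web root
$publicDir  = __DIR__ . '/downloads';         // web-accessible scratch directory

$files  = ['42' => 'x.jpg'];                  // stand-in for a database lookup
$fileid = $_GET['fileid'] ?? '';

if (!isset($files[$fileid])) {
    http_response_code(404);
    exit('Not found');
}

$random = bin2hex(random_bytes(8)) . '.jpg';  // random public name
copy($privateDir . '/' . $files[$fileid], $publicDir . '/' . $random);

header('Location: /downloads/' . $random);    // Apache serves the copy from here on
exit;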
Good idea, but what about performance?
assaqqaf
Performance will be better than with the first method, where PHP sends back the content. If you have a lot of downloads you will start to see a performance hit with the first method, since a PHP process has to stay alive for as long as the file is being sent. So if 10 people are downloading images, and some of the images are big or people have slow connections, those 10 PHP processes will stay in memory along with the Apache processes. My method allows Apache to take full control of the content delivery, as it should, and allows the PHP process to die as soon as it has redirected to the newly created image file :)
Sabeen Malik
Thanks, but the copy process is done through PHP, so isn't it the same?
assaqqaf
Yes, that is true. However, the copy would take, let's say, 500ms, whereas the sending-content process lasts for as long as the file is being downloaded. So if a 10MB file is being downloaded over a slow connection, you can imagine how long the PHP process will stay alive. Unless the image file is really small, you can't beat the speed of that copy operation.
Sabeen Malik
Thanks very much. I really like that, and I'll test it.
assaqqaf
Copying files around is probably the least elegant solution. If you are going to have a lot of images, using a single directory is going to end up with a significant performance hit, especially if you are moving the physical files around.
simplemotives
That's why we have the cron script which clears out files that were created some time ago. This solution works because at no point does the user come directly into contact with the file, and it frees PHP from the job of sending through the content.
Sabeen Malik
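The cron job mentioned here could be a small PHP script along these lines, reusing the scratch directory and the 30-minute threshold from the discussion above:

<?php
// cleanup.php, run from cron (e.g. every 10 minutes) to purge old copies.
$publicDir = __DIR__ . '/downloads';   // same scratch directory as above
$maxAge    = 30 * 60;                  // 30 minutes, in seconds

foreach (glob($publicDir . '/*.jpg') as $file) {
    if (time() - filemtime($file) > $maxAge) {
        unlink($file);
    }
}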
Parvenu74
We ran into this problem several years ago. The file size was 5MB+ in all cases; we tried both methods and worked out that the copying-over approach was cheaper on the CPU, because you don't have a PHP process running for everyone that's downloading the file. At least that's how it was explained to me by the server admins :) I never said it was simpler; anything simpler than your suggestion would be to let them see the direct URL of the image. It's a choice between seeing a lot of spikes in server load or a few predictable ones. That's just my opinion, backed up by experience.
Sabeen Malik
"At-least thats how it was explained to me by the server admins..." In other words you have no idea what the load on the CPU is because you don't monitor it. Having a PHP script read a file to the output stream takes almost no time at all: the file is read into the output buffer -- which is managed by Apache -- and then the script is done. 5MB+ is all you're using? The technique I described and the code I showed is from a production website serving up mp3 files to AUTHENTICATED users (the PHP script checks for authentication and redirects to login if req'd) and my CPU utilitzation is squat.
Parvenu74
Sir, your solution is the best, hands down!
Sabeen Malik
A: 

I think the best method would be to encrypt it.

ldog