Hi,

First of all, I don't know if this is where I have to ask this question; so I'll count on the moderators to move it if need be.

I have a Linux PHP web hosting account on GoDaddy.

When I have to upload a file, I normally use FTP, either a client or the host's file manager.

However, if the file is one which I have to download from another website, I would prefer to "download" it directly to my hosting account; the reason being that I'm in Mauritius and our connection is among the slowest in the world. So I would prefer using the host's (presumably higher) bandwidth so that transfers go more quickly.

So, my question is: do any of you have a solution (PHP script, Java applet, or anything) that I could use to achieve that?

Thanks in advance,

Yusuf

A: 

Wget. I use it for downloading WordPress straight to a server:

 # Download the title page of example.com to a file
 # named "index.html".
 wget http://www.example.com/

 # Download Wget's source code from the GNU ftp site.
 wget ftp://ftp.gnu.org/pub/gnu/wget/wget-latest.tar.gz

The examples are from the link above.

Luke
That's the only thing he didn't mention: SSH access ;). I think he'd find this rather problematic on a shared hosting account.
Christian Sciberras
True story. :) Didn't really think about that.
Luke
Ah! I do have SSH access, though I don't think I have access to wget; that's the first thing I tried yesterday. They seem to have limited the commands available.
Yusuf
Check if they have `curl` or `ftp`.
sanmai
They have FTP, but can you upload to an FTP server directly from a URL? I don't want to have to download the file and then re-upload it. If I have a link, I would like to upload it there directly from that link. They have curl too.
Yusuf
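
Since the account does have curl, here is a minimal server-side sketch using PHP's cURL extension (my addition; it assumes the extension is enabled on the account, and the URL and target filename are placeholders):

<?php
    // Sketch: stream a remote URL straight to a file on the server
    // via the PHP cURL extension. URL and path are placeholders.
    $fp = fopen('...target-file...', 'w');
    if($fp === false) die('Cannot open target file.');
    $ch = curl_init('http://...target-url...');
    curl_setopt($ch, CURLOPT_FILE, $fp);            // write the response body to the file
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    echo $ok ? 'Downloaded.' : 'Download failed.';
?>

Because CURLOPT_FILE hands curl a file handle to write into, the response is never held in memory as a single string.
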
+1  A: 

First of all, this might be a security risk on your server.

Secondly, here's a little untested code:

<?php
    echo 'get file...';
    $data = file_get_contents('http://...target-url...');
    if($data === false) die('Failed getting file.');
    echo 'saving file...';
    $succ = file_put_contents('...target-file...', $data);
    echo $succ ? 'Success' : 'Failed saving file';
?>

Usable script (put into file "down.php" in your web root):

<?php
    echo 'get file...';
    if(!isset($_REQUEST['from'])) die('Fail: Parameter "from" not set.');
    if(!isset($_REQUEST['to'])) die('Fail: Parameter "to" not set.');
    $data=file_get_contents($_REQUEST['from']);
    if($data===false)die('Failed getting file.');
    echo 'saving file...';
    $succ=file_put_contents($_REQUEST['to'],$data);
    echo $succ ? 'Success' : 'Failed saving file';
?>

Usage (run it from a web browser):

http://yoursite.com/down.php?from=http://yourothersite.com/file-content.txt&to=/var/www/public_html/target.txt
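
One caveat (my addition): if the source URL itself contains a query string, its & characters will be read as separators for down.php's own parameters, so the "from" value should be URL-encoded first. A quick way to build the link, with placeholder values:

<?php
    // Sketch: build a safe down.php URL by encoding both parameters.
    // $source and $target are placeholder values.
    $source = 'http://yourothersite.com/file.php?id=1&rev=2';
    $target = '/var/www/public_html/target.txt';
    echo 'http://yoursite.com/down.php?from=' . urlencode($source)
       . '&to=' . urlencode($target);
?>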

WARNING: Make sure you remove script after use, it is a grave security issue.

Christian Sciberras
In principle, yes - but it's going to fall over trying to load large files (and since the OP says it is to avoid bandwidth issues...) since the file is loaded into memory - a better solution would be: $in=fopen($url,'r'); $out=fopen($file,'w'); while (!feof($in)) fputs($out, fgets($in, BUF_SIZE));
symcbean
Good point, symcbean. I've had this issue when running an uploaded SQL file and got it fixed with buffering. However, in either case you end up increasing either the memory limit or the execution time limit. My option is bound to be faster but more memory-consuming, while yours is slower but less memory-consuming.
Christian Sciberras
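
For reference, a sketch of symcbean's streaming approach (my addition; the 8 KB chunk size is arbitrary, and fread is used instead of fgets since the file may be binary). It copies the file chunk by chunk, so memory use stays flat regardless of file size:

<?php
    // Sketch: copy the remote file in fixed-size chunks so memory
    // use stays flat. URL and path are placeholders as above.
    $in = fopen('http://...target-url...', 'r');
    if($in === false) die('Failed opening source.');
    $out = fopen('...target-file...', 'w');
    if($out === false) die('Failed opening target.');
    while(!feof($in)) {
        fwrite($out, fread($in, 8192)); // 8 KB per chunk
    }
    fclose($in);
    fclose($out);
    echo 'Done';
?>
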
Maybe I'll write a little script with this piece of code then... though I'll have to do that when I have some free time for coding; I was looking for a "lazy" solution :S
Yusuf
Yusuf, I'll write a full one for you under the edited one. Please accept this answer if it helped you out.
Christian Sciberras