Hey guys, I want to download any file from any website with my forcedownload script. It's a simple script which just prompts a download window to save the file to the desktop. I simply pass ?path=http://www.google.com/images/whatever/file.jpg to the URL and the script fires the download.

However, this script is of course not working for every page. Most of the time I don't have the permissions to do so:

HTTP request failed! HTTP/1.0 403 Forbidden in ....

Is there a method where I can check whether the forcedownload from some domain would work or not? Something like is_readable().

regards matt

edit:

    <?php
    error_reporting(E_ALL);

    if (isset($_GET['p'])) {
        $path = $_GET['p'];
    } else {
        header('Location: ' . 'mypage');
        exit; // a Location header does not halt the script by itself
    }

    $file = $path;

    header("Cache-Control: no-cache");
    header("Expires: -1");
    header("Content-Type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"" . basename($file) . "\"");
    header("Content-Transfer-Encoding: binary");
    //header("Content-Length: " . filesize($file));

    // readfile() already writes the file to the output buffer and returns
    // the number of bytes sent, so don't echo its return value.
    readfile($file);
    ?>
+3  A: 

If your script uses curl for downloading the file, you can use curl_getinfo and check for the CURLINFO_HTTP_CODE before passing the file to the user.
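A minimal sketch of that check, assuming the cURL extension is available (the helper names `url_is_downloadable()` and `status_is_ok()` are mine, not part of any library): issue a body-less request first, read the status via curl_getinfo, and only stream the file if the code is in the 2xx range.

```php
<?php
// Decide whether an HTTP status code means the file can be served.
// Anything in the 2xx range is OK; 403, 404, 5xx, and redirects are not.
function status_is_ok($httpCode)
{
    return $httpCode >= 200 && $httpCode < 300;
}

// Probe the URL with a HEAD-style request before passing it through.
// url_is_downloadable() is an illustrative name, not a built-in.
function url_is_downloadable($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // headers only, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't print anything
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return status_is_ok($code);
}
```

In the forcedownload script you would then call `url_is_downloadable($path)` before sending any headers, and redirect back to your page on failure instead of dumping the 403 warning.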

Viktor Stískala
+1  A: 

I created this class for my MediaWiki word-count project.

// Copyright PHPExperts.pro
// License: Any user on StackOverflow may use this code under the BSD License.

/**
* Web page datatype that holds all the various parts
* and info about a web page.
*/
class WebPage
{
    public $url;
    public $headers;
    public $body;
    public $text;


    public function __construct($url)
    {
        // 1. Bail out now if the cURL extension is not loaded.
        if (!extension_loaded('curl'))
        {
            throw new WebPageException(WebPageException::MISSING_CURL);
        }

        // 2. Make sure the URL is valid.
        self::ensureValidURL($url);

        // 3. Store the URL.
        $this->url = $url;
    }

    /**
    * Determine if a URL is valid.
    *
    * @param string $url
    * @return bool True if the URL is a string and a valid URL; false otherwise.
    */
    public static function isURLValid($url)
    {
        return (is_string($url) &&
                filter_var($url, FILTER_VALIDATE_URL) !== false);
    }

    public static function ensureValidURL($url)
    {
        if (!self::isURLValid($url))
        {
            throw new WebPageException(WebPageException::INVALID_URL, array($url));
        }
    }

    // captureHeader() donated by [email protected],
    // via http://us.php.net/curl_setopt_array
    private function captureHeader($ch, $header)
    {
        $this->headers[] = $header;
        return strlen($header);
    }

    public function fetchURL()
    {
        $ch = curl_init();
        curl_setopt_array($ch, array(CURLOPT_URL => $this->url,
                                     CURLOPT_RETURNTRANSFER => 1,
                                     CURLOPT_HEADERFUNCTION => array($this, 'captureHeader'),
                                     CURLOPT_TIMEOUT => 5,
                                     )
                         );

        $data = curl_exec($ch);
        curl_close($ch);

        if ($data === false || is_null($data) || $data == '')
        {
            throw new WebPageException(WebPageException::BLANK_URL, array($this->url));
        }

        // TODO: Need to handle HTTP error messages, such as 404 and 502.
        $this->body = $data;

        // Uses code from [email protected]
        $this->text = remove_HTML($data);
    }
}

After you run WebPage::fetchURL() (which registers captureHeader() as the cURL header callback), you then just foreach through $this->headers, and if you don't find an HTTP/1.0 403 Forbidden status line, you're good to go.
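That scan can be sketched as a small standalone function (the name `headers_allow_download()` is mine, for illustration): look for HTTP status lines among the captured headers and reject anything in the error range, not just 403.

```php
<?php
// Scan the raw header lines collected by the cURL header callback
// and decide whether the response is safe to pass through.
// headers_allow_download() is an illustrative helper, not part of the class.
function headers_allow_download(array $headers)
{
    foreach ($headers as $header) {
        // Match any HTTP status line, e.g. "HTTP/1.0 403 Forbidden"
        if (preg_match('#^HTTP/\d\.\d\s+(\d{3})#', $header, $m)) {
            $code = (int) $m[1];
            if ($code >= 400) {
                return false; // 403 Forbidden, 404 Not Found, 502, etc.
            }
        }
    }
    return true;
}
```

Checking the whole 4xx/5xx range is slightly more robust than grepping only for the literal "403 Forbidden" string, since a dead link (404) should also abort the forced download.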

This thoroughly answers your question, so I expect credit.

hopeseekr