views: 440
answers: 5
Is there a Windows command to copy or download files from an HTTP URL to the filesystem? I've tried copy, xcopy, and robocopy, and they don't seem to support HTTP URLs.

A: 

I can't remember a command-line utility for this. You could implement something similar yourself in JScript (using WinHttpRequest) and run it like this:

wscript your_script.js

Or just install MSYS, which includes wget.
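A minimal sketch of the WinHttpRequest approach mentioned above (Windows Script Host only; the URL and output filename are placeholders):

```javascript
// your_script.js -- run with: wscript your_script.js
// JScript under Windows Script Host only: it relies on COM objects,
// so it will not run under Node or in a browser.
var http = new ActiveXObject("WinHttp.WinHttpRequest.5.1");
http.Open("GET", "http://www.example.com/file.zip", false); // synchronous request
http.Send();

// Write the binary response body to disk via ADODB.Stream.
var stream = new ActiveXObject("ADODB.Stream");
stream.Type = 1;                  // adTypeBinary
stream.Open();
stream.Write(http.ResponseBody);
stream.SaveToFile("file.zip", 2); // 2 = adSaveCreateOverWrite
stream.Close();
```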

begray
+2  A: 

I am not familiar with any commands on Windows that can do that, but I always download GNU wget on Windows for these and similar purposes.

ayaz
+3  A: 

You can use a PowerShell script to accomplish this:

Get-Web http://www.msn.com/ -toFile www.msn.com.html

function Get-Web($url, 
    [switch]$self,
    $credential, 
    $toFile,
    [switch]$bytes)
{
    #.Synopsis
    #    Downloads a file from the web
    #.Description
    #    Uses System.Net.Webclient (not the browser) to download data
    #    from the web.
    #.Parameter self
    #    Uses the default credentials when downloading that page (for downloading intranet pages)
    #.Parameter credential
    #    The credentials to use to download the web data
    #.Parameter url
    #    The page to download (e.g. www.msn.com)    
    #.Parameter toFile
    #    The file to save the web data to
    #.Parameter bytes
    #    Download the data as bytes   
    #.Example
    #    # Downloads www.live.com and outputs it as a string
    #    Get-Web http://www.live.com/
    #.Example
    #    # Downloads www.live.com and saves it to a file
    #    Get-Web http://www.msn.com/ -toFile www.msn.com.html
    $webClient = New-Object Net.WebClient
    if ($credential) {
        $webClient.Credentials = $credential
    }
    if ($self) {
        $webClient.UseDefaultCredentials = $true
    }
    if ($toFile) {
        if (-not "$toFile".Contains(":")) {
            $toFile = Join-Path $pwd $toFile
        }
        $webClient.DownloadFile($url, $toFile)
    } else {
        if ($bytes) {
            $webClient.DownloadData($url)
        } else {
            $webClient.DownloadString($url)
        }
    }
}

Source: http://blogs.msdn.com/mediaandmicrocode/archive/2008/12/01/microcode-powershell-scripting-tricks-scripting-the-web-part-1-get-web.aspx

notandy
Great! I've changed the ssh cmd to PowerShell and it worked out well.
Pablote
+1  A: 

cURL comes to mind.

curl -o homepage.html http://www.apptranslator.com/

This command downloads the page and stores it in the file homepage.html. Thousands of options are available.

Serge - appTranslator
A: 

Just use the Win32 API (one line of code in C...)

Can you post which API?
Dustin Getz
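The one-liner being alluded to is presumably URLDownloadToFile from urlmon.dll; a minimal sketch (Windows-only; the URL and output filename here are placeholders):

```c
// Minimal sketch using URLDownloadToFile (Windows-only).
// Build with MSVC: cl get.c urlmon.lib
#include <windows.h>
#include <urlmon.h>
#pragma comment(lib, "urlmon.lib")

int main(void)
{
    HRESULT hr = URLDownloadToFileA(
        NULL,                      // no calling ActiveX object
        "http://www.example.com/", // source URL
        "page.html",               // destination file
        0,                         // reserved, must be 0
        NULL);                     // no status callback
    return SUCCEEDED(hr) ? 0 : 1;
}
```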