Is there a Windows command to copy or download files from an HTTP URL to the filesystem? I've tried copy, xcopy and robocopy, but they don't seem to support HTTP URLs.
A:
I can't think of a built-in command-line utility for this. You could implement something similar in JScript (using WinHttpRequest) and run it like this:
wscript your_script.js
Or just install MSYS with wget.
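For instance, a rough JScript sketch of the WinHttpRequest approach might look like the following; it pairs WinHttpRequest with an ADODB.Stream to write the response bytes to disk, and the script name, URL and paths are only placeholders:

// download.js - sketch only: WinHttpRequest fetches the URL, ADODB.Stream saves the bytes
var url  = WScript.Arguments(0);   // e.g. http://www.example.com/file.zip
var file = WScript.Arguments(1);   // e.g. C:\downloads\file.zip

var http = new ActiveXObject("WinHttp.WinHttpRequest.5.1");
http.Open("GET", url, false);      // false = synchronous request
http.Send();

if (http.Status == 200) {
    var stream = new ActiveXObject("ADODB.Stream");
    stream.Type = 1;               // adTypeBinary
    stream.Open();
    stream.Write(http.ResponseBody);
    stream.SaveToFile(file, 2);    // 2 = adSaveCreateOverWrite
    stream.Close();
} else {
    WScript.Echo("Download failed: HTTP " + http.Status);
}

You would then call it with the URL and target path as arguments, e.g. cscript download.js http://www.example.com/file.zip C:\downloads\file.zip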
begray
2009-03-03 19:02:38
+3
A:
You can use a PowerShell script to accomplish this.
Get-Web http://www.msn.com/ -toFile www.msn.com.html
function Get-Web($url,
    [switch]$self,
    $credential,
    $toFile,
    [switch]$bytes)
{
    #.Synopsis
    #    Downloads a file from the web
    #.Description
    #    Uses System.Net.WebClient (not the browser) to download data
    #    from the web.
    #.Parameter self
    #    Uses the default credentials when downloading that page (for downloading intranet pages)
    #.Parameter credential
    #    The credentials to use to download the web data
    #.Parameter url
    #    The page to download (e.g. www.msn.com)
    #.Parameter toFile
    #    The file to save the web data to
    #.Parameter bytes
    #    Download the data as bytes
    #.Example
    #    # Downloads www.live.com and outputs it as a string
    #    Get-Web http://www.live.com/
    #.Example
    #    # Downloads www.msn.com and saves it to a file
    #    Get-Web http://www.msn.com/ -toFile www.msn.com.html
    $webClient = New-Object Net.WebClient
    if ($credential) {
        # Credentials (plural) is the property name on System.Net.WebClient
        $webClient.Credentials = $credential
    }
    if ($self) {
        $webClient.UseDefaultCredentials = $true
    }
    if ($toFile) {
        # Treat paths without a drive qualifier as relative to the current directory
        if (-not "$toFile".Contains(":")) {
            $toFile = Join-Path $pwd $toFile
        }
        $webClient.DownloadFile($url, $toFile)
    } else {
        if ($bytes) {
            $webClient.DownloadData($url)
        } else {
            $webClient.DownloadString($url)
        }
    }
}
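A few example calls, assuming the function above has been loaded into your session (the intranet URL is just a placeholder):

Get-Web http://www.msn.com/                                   # returns the page as a string
Get-Web http://www.msn.com/ -toFile msn.html                  # saves it to .\msn.html
Get-Web http://intranet/status.html -self -toFile status.html # uses your Windows credentials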
notandy
2009-03-03 19:14:53
Great! I've changed the ssh cmd to PowerShell and it worked out well.
Pablote
2009-03-03 19:45:57
+1
A:
cURL comes to mind.
curl -o homepage.html http://www.apptranslator.com/
This command downloads the page and stores it in the file homepage.html. Thousands of options are available.
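For example, -L makes curl follow redirects and -s suppresses the progress meter:
curl -s -L -o homepage.html http://www.apptranslator.com/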
Serge - appTranslator
2009-03-03 19:26:22