Hey,

I want to copy files from one location (say, Pittsburgh) to another (say, Melbourne) over a network share. A normal file copy takes a long time, since the files are usually gigabytes in size. Can I use web services to improve the performance, or is there some other alternative?

Thanks

+4  A: 

Web services might work, but it would be better to use a protocol designed for transferring files across the Internet, such as FTP.

There are plenty of FTP libraries and tutorials out there.

Web services do not have any special properties that will speed up connections or file transfers. Latency and throughput will stay the same.
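
For what it's worth, a minimal upload with Python's ftplib might look like this (the host, credentials, and file name are placeholders, not anything from the question):

    from ftplib import FTP

    # Hypothetical host, credentials, and file name; substitute your own.
    ftp = FTP("ftp.melbourne.example.com")
    ftp.login("user", "password")
    with open("large_file.bin", "rb") as f:
        # Binary transfer; a larger blocksize can help a little on high-latency links.
        ftp.storbinary("STOR large_file.bin", f, blocksize=65536)
    ftp.quit()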

Oded
Even FTP takes a long time to transfer these files; they are usually several GB.
superstar
Maybe your network is just saturated. Have you profiled it while transferring a file?
asawyer
"longer" is a relative term, longer than what?
Neil N
@superstar - GBs will always take a large amount of time, in particular across continents...
Oded
@Oded: That is true. That's the reason I asked for an alternative to improve the performance (if any).
superstar
@superstar - There is no magic. Compression may help. And of course, never underestimate the bandwidth of a truck full of Blu-ray discs ;)
Oded
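
To make the compression suggestion concrete, here is a minimal sketch using Python's gzip module (the file names are placeholders): compress before sending, decompress on arrival. Whether it pays off depends entirely on how compressible the data is.

    import gzip
    import shutil

    # Placeholder file names; compress before the transfer, decompress after it.
    with open("large_file.bin", "rb") as src, gzip.open("large_file.bin.gz", "wb") as dst:
        shutil.copyfileobj(src, dst)  # streams in chunks, so GB-sized files are fine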
Oded, there's no magic, but there are definitely ways to improve upon FTP.
Steven Sudit
@Oded: Okay, so web services are out of the question here (no big difference in performance), right?
superstar
@Steven Sudit - I am sure there are, but in the context of the question, web services are a non-starter.
Oded
@superstar - Pretty much. Perhaps overnighting a burned DVD with a courier would be more cost-effective.
Oded
@Oded: I think we're all in agreement regarding web services. As for the overnight courier, it has great bandwidth but poor latency. :-)
Steven Sudit
+1  A: 

FTP should still be the king of file transfers; transferring files is what it was designed to do. I don't think you can get much better than that across the Internet.

Neil N
Sure you can. I linked to one example in my answer, and described another. FTP is nothing great.
Steven Sudit
FTP clients can use multiple connections as well, making that point kinda moot.
Neil N
By "multiple", are we talking about "two"?
Steven Sudit
No, by multiple I mean "more than one"; a smart client will add more connections until it hits the point of diminishing returns.
Neil N
The FTP I'm familiar with has a control connection and a data connection. Would you please post the link to the RFC that defines an extension allowing multiple data connections?
Steven Sudit
@Steven Sudit: You don't need an RFC extension to have multiple connections. How do you think a client like FileZilla can connect to multiple servers at once? Does it use other ports? No. How do you think an FTP server can serve multiple clients at once? Take two seconds to think about how file resume works: if a client requests a file with resume offsets several times, it can download the file in several simultaneous pieces, similar to how BitTorrent works.
Neil N
Yes, of course a client can connect to multiple servers, but that's not what we're talking about. We're talking about sending a single file over multiple sockets at the same time. This is important because the overlapping overcomes latency. To the best of my knowledge, FTP does not support this. If I'm wrong, I want to learn, so please throw an RFC at me or some other reliable source.
Steven Sudit
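
For what it's worth, what Neil N describes can be done entirely client-side, with no protocol extension: open several control connections and use REST to start each RETR at a different offset. A rough sketch with Python's ftplib, assuming the server supports the SIZE and REST commands (host, credentials, and file names are all placeholders):

    import threading
    from ftplib import FTP

    # All placeholders; substitute your own server and file.
    HOST, USER, PASSWD = "ftp.example.com", "user", "password"
    REMOTE = LOCAL = "large_file.bin"
    CONNECTIONS = 4

    def fetch_chunk(offset, length):
        """Fetch `length` bytes starting at `offset` over a dedicated FTP session."""
        ftp = FTP(HOST)
        ftp.login(USER, PASSWD)
        ftp.voidcmd("TYPE I")  # binary mode is required before REST/RETR
        conn = ftp.transfercmd(f"RETR {REMOTE}", rest=offset)  # REST seeks, RETR streams
        remaining = length
        with open(LOCAL, "r+b") as f:
            f.seek(offset)
            while remaining > 0:
                data = conn.recv(min(65536, remaining))
                if not data:
                    break
                f.write(data)
                remaining -= len(data)
        conn.close()   # closing early abandons the rest of the server's stream
        ftp.close()    # drop the session; we only wanted our slice of the file

    # Probe the size, pre-allocate the local file, then fan out.
    ftp = FTP(HOST)
    ftp.login(USER, PASSWD)
    ftp.voidcmd("TYPE I")
    size = ftp.size(REMOTE)  # assumes the server supports SIZE
    ftp.quit()
    with open(LOCAL, "wb") as f:
        f.truncate(size)

    chunk = size // CONNECTIONS
    threads = []
    for i in range(CONNECTIONS):
        length = size - i * chunk if i == CONNECTIONS - 1 else chunk
        threads.append(threading.Thread(target=fetch_chunk, args=(i * chunk, length)))
    for t in threads:
        t.start()
    for t in threads:
        t.join()

Steven's point about the RFC still stands: the protocol itself says nothing about this, so it is purely a client-side trick built on resume.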
+1  A: 

Web services are not particularly helpful here. While there are dedicated products that perform efficient, long-distance transfers of large files (see http://www.asperasoft.com/en/technology_sections), the basic trick is to overcome latency by sending chunks in parallel. This can be done over HTTP just fine.
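
As a sketch of the parallel-chunk idea over plain HTTP, assuming the server honors Range requests and reports a Content-Length (the URL and file name are placeholders):

    import threading
    import urllib.request

    URL = "http://example.com/large_file.bin"  # placeholder URL
    LOCAL = "large_file.bin"
    CONNECTIONS = 4

    def fetch_range(start, end):
        """Fetch bytes [start, end] with an HTTP Range request and write them in place."""
        req = urllib.request.Request(URL, headers={"Range": f"bytes={start}-{end}"})
        with urllib.request.urlopen(req) as resp, open(LOCAL, "r+b") as f:
            f.seek(start)
            f.write(resp.read())

    # Ask for the total size, pre-allocate the local file, then fan out.
    head = urllib.request.Request(URL, method="HEAD")  # assumes the server answers HEAD
    with urllib.request.urlopen(head) as resp:
        size = int(resp.headers["Content-Length"])
    with open(LOCAL, "wb") as f:
        f.truncate(size)

    chunk = size // CONNECTIONS
    threads = []
    for i in range(CONNECTIONS):
        start = i * chunk
        end = size - 1 if i == CONNECTIONS - 1 else start + chunk - 1
        threads.append(threading.Thread(target=fetch_range, args=(start, end)))
    for t in threads:
        t.start()
    for t in threads:
        t.join()

Each connection overlaps its round trips with the others, which is how the latency gets hidden.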

Steven Sudit
Considering this is a correct answer, I find it odd that it's been downvoted, particularly without explanation. I guess someone here is being "strategic" because they've confused SO with a video game.
Steven Sudit
Here's an upvote to shatter your assumptions. :)
Neil N
Damn you for upvoting me! :-)
Steven Sudit