views: 693

answers: 7

What is the best protocol I can use to transfer a big file? It should be fast and reliable. Some of the systems involved may have low bandwidth; I need to transmit the file across India. The file size may be 100 to 500 MB.

+6  A: 

File Transfer Protocol

BitTorrent

BitTorrent is a peer-to-peer file sharing protocol used for distributing large amounts of data.

List of file transfer protocols

rahul
+1 for the "list of FTP's". :-)
Cerebrus
+2  A: 

FTP

RaYell
+4  A: 

Even though FTP is the most efficient protocol for file transfer, it's pretty hard to implement. I would use HTTP: support for it is built in on most platforms, and it's more resilient to firewalls.
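As a minimal sketch (the URL and destination file name are placeholders), Python's standard library can already pull a file over HTTP in a couple of lines:

    # Download a file over HTTP using only the standard library.
    # The URL and destination file name below are placeholders.
    import urllib.request

    urllib.request.urlretrieve("http://example.com/big-file.bin", "big-file.bin")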

ZZ Coder
It doesn't make sense to implement the protocol yourself, just use some existing library.
Adam Byrtek
A: 

HTTP is probably the way to go for small files and/or unsophisticated users. Having to configure a firewall will stop many users cold, and almost every network allows HTTP transfers over port 80 with no special configuration.

You did say big files, though. You can write the transfer code so that it uses range requests to resume interrupted downloads.

Someone has probably written a file transfer library that handles partial transfers and retries automatically, though I don't know of one.
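For illustration, here is a minimal sketch of that range-request idea in Python (the URL is a placeholder, and it assumes the server answers Range requests with 206 Partial Content; real code would also need timeouts, error handling, and a retry loop):

    # Resume a large HTTP download using a Range request.
    # Placeholder URL; no timeout/retry handling in this sketch.
    import os
    import urllib.request

    def download_with_resume(url, dest, chunk_size=64 * 1024):
        # Ask the server to start from however many bytes we already have.
        offset = os.path.getsize(dest) if os.path.exists(dest) else 0
        request = urllib.request.Request(url, headers={"Range": "bytes=%d-" % offset})
        with urllib.request.urlopen(request) as response:
            # 206 means the server honoured the Range header; a plain 200 means
            # it sent the whole file again, so start the local copy over.
            mode = "ab" if response.status == 206 else "wb"
            with open(dest, mode) as out:
                while True:
                    chunk = response.read(chunk_size)
                    if not chunk:
                        break
                    out.write(chunk)

    download_with_resume("http://example.com/big-file.bin", "big-file.bin")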

Mark Bessey
A: 

This might be of some interest for file transfer with .NET, not that the original post mentioned .NET in any way, shape or form.

Sending Files in Chunks with MTOM Web Services and .NET 2.0 By Tim Mackey
How to send large files across web services in small chunks using MTOM (WSE 3)

Just note that you need to install Web Service Enhancements 3.0 (you will find relevant links in the article).

Have an otherwise good day sir!

CS
A: 

Well, I think it's best to use TCP. It's reliable and UDP isn't: UDP is faster, but as a best-effort protocol it isn't "safe". P2P programs use UDP since it's faster and they don't care that much about packet loss, but FTP usually runs over TCP. So I'd suggest using TCP and programming over sockets. Pick a high, unassigned port (valid port numbers only go up to 65535; ports in the dynamic range above 49152 are usually free).
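A minimal sketch of the sender side over a plain TCP socket (host, port, and file name are placeholders; a real tool would also need the matching receiver, some framing such as sending the file size first, and error handling):

    # Stream a file over a TCP socket (sender side only); placeholders throughout.
    import socket

    def send_file(path, host="receiver.example.com", port=5001, chunk_size=64 * 1024):
        with socket.create_connection((host, port)) as conn, open(path, "rb") as src:
            while True:
                chunk = src.read(chunk_size)
                if not chunk:
                    break
                conn.sendall(chunk)  # TCP delivers the bytes reliably and in order

    send_file("big-file.bin")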

Kevin_Jim
+1  A: 

Rsync is a great fit for this problem. It's designed to send/update big files remotely.

  • It runs from the command line, so you can launch it fairly easily as an external process.
  • It can synchronize two remote file systems.
  • It handles large file sizes.
  • It has a clever algorithm that only copies the differences between files.
  • It's widely implemented and open source.
  • It has a throttling capability, so you can limit how much of a WAN connection the transfer uses and avoid starving other processes of connectivity.
  • It internally uses zlib to compress transferred data blocks.

original site: http://samba.anu.edu.au/rsync/

securing rsync with ssh: http://www.linux.com/archive/feature/113847

detailed features: http://en.wikipedia.org/wiki/Rsync
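As a minimal sketch of launching it as an external process (host and paths are placeholders; rsync and SSH access to the remote machine are assumed to be in place), using its -z, --partial and --bwlimit options:

    # Launch rsync as an external process; host and paths are placeholders.
    import subprocess

    subprocess.run(
        [
            "rsync",
            "-avz",           # archive mode, verbose, zlib compression
            "--partial",      # keep partially transferred files so they can be resumed
            "--bwlimit=500",  # throttle the transfer (KB/s) to avoid saturating the link
            "big-file.bin",
            "user@remote.example.com:/data/big-file.bin",
        ],
        check=True,
    )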

cartoonfox