views: 139
answers: 3

Hi,

Is there an alternative to scp for transferring a large file from one machine to another that can open parallel connections and can also pause and resume the transfer?

Please don't migrate this to serverfault.com. I am not a system administrator; I am a developer trying to transfer old database dumps between backup hosts and servers.

Thank you

+2  A: 

Take a look at rsync to see if it will meet your needs.
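A minimal sketch of what that might look like, assuming the dump is at /path/to/dump.sql.gz and the destination is user@backup-host (both hypothetical names). rsync's --partial keeps a partially transferred file around so an interrupted copy can be resumed by re-running the same command; note that rsync uses a single connection rather than parallel ones.

# transfer with resume support; -P is shorthand for --partial --progress
rsync -avz --partial --progress /path/to/dump.sql.gz user@backup-host:/backups/

# interrupt with Ctrl-C to "pause"; run the same command again to resume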

The correct placement of questions is not based on your role, but on the type of question. Since this one is not strictly programming-related, it is likely to be migrated.

Dennis Williamson
+2  A: 

You could try using split(1) to break the file apart and then scp the pieces in parallel. The file could then be combined into a single file on the destination machine with 'cat'.

# on local host
split -b 1M large.file large.file.   # split into 1MiB chunks (adjust -b to taste)
for f in large.file.*; do scp "$f" remote_host: & done
wait                                 # block until all background scp jobs finish

# on remote host
cat large.file.* > large.file        # split's suffixes sort correctly, so order is preserved
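
If you go this route, it's also worth checking that the reassembled file matches the original; a quick sketch using sha256sum (filenames as above):

# on local host
sha256sum large.file        # note the checksum

# on remote host, after reassembly
sha256sum large.file        # should print the same checksum
rm large.file.*             # then the chunks can be removed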
Mike K
A: 

Your problem description is lacking.

Why does it matter that the file is large?

What problems do you believe will be solved by using "parallel connections"?

What problems are you running into that you need to be able to pause and resume the transfer?

You should probably be talking to your local system administrator. She knows how to copy large files between machines.

Mark Edgar