Not really a programming question, but relevant to many programmers...

Let's say I have opened an SSH session to another computer.

remote:html avalys$ ls
welcome.msg index.html readme.txt
remote:html avalys$

Is there any command that I can type in my remote shell that will immediately transfer one of the files in the current directory (e.g. welcome.msg) to my local computer, i.e.

remote:html avalys$ stransfer welcome.msg
Fetching /home/avalys/html/welcome.msg to welcome.msg
/home/avalys/html/welcome.msg 100% 23KB 23.3KB/s 00:00
remote:html avalys$

The only way I know of to do this is to open a parallel SFTP session and cd to the directory I'm currently in within the SSH session, which is a real PITA when administering a server remotely.

EDIT: I am aware of the possibility of using a reverse sftp/scp connection, but that involves more typing. It would be great if I could type just the name of some command (e.g. "stransfer"), and the file(s) I want transferred, and have it Just Work.

A: 

At school, if I'm transferring a file from one Linux system to another, I usually log in to the remote system over ssh and then use scp to copy the file from the remote system back to my computer. I'm not sure if that is what you are looking for, or whether it is even applicable in your case, but that is usually how I get around having to open another terminal window.

Just to clarify, if you're not familiar with scp: you still have to specify the location where you would like to put the file on your computer, but for me that is generally easier to remember than the location of the file on the remote computer. Here is a link that may be helpful: http://www.computerhope.com/unix/scp.htm
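For example, assuming your local machine accepts ssh connections and is reachable from the remote host (the host name local.example.com and the user localuser are placeholders, not from this answer), the copy back might look like:

remote:html avalys$ scp welcome.msg localuser@local.example.com:~/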

Bill
+2  A: 

You could set up such an inverted transfer connection with ssh -R port:127.0.0.1:22 user@host when you connect; that exposes your local sshd on the chosen port of the remote host for scp back.

From the remote shell, use scp -P port file user@127.0.0.1: to copy files through it.
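A minimal sketch of the round trip (the port 2222 and the user names are illustrative, not part of the original answer):

local:~$ ssh -R 2222:127.0.0.1:22 avalys@remote
remote:html avalys$ scp -P 2222 welcome.msg localuser@127.0.0.1:~/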

Joshua
Nice idea. Maintaining a .ssh/config file on each of the machines involved will save you even more typing.
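For example, a hypothetical entry in the local ~/.ssh/config could set up the reverse tunnel automatically on every connection (host name, user, and port are placeholders):

Host remote
    HostName remote.example.com
    User avalys
    RemoteForward 2202 127.0.0.1:22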
innaM
+1  A: 

If you are into patching things (that, IMHO, shouldn't be patched), take a look at ssh-xfer.

innaM
A: 

I'll often do things like this to avoid creating an extra session:

local:~$ ssh remote ls
data/ work/

local:~$ ssh remote ls data
welcome.msg index.html readme.txt

local:~$ scp remote:data/welcome.msg .

What makes this doable is:

  • Setting up keys so that you don't have to enter the password for each command (see the sketch after this list)
  • Using Emacs as my shell, so that editing previous commands and copy/pasting is easy
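A minimal sketch of the key setup (ssh-copy-id ships with OpenSSH on most systems; the host alias remote is a placeholder):

local:~$ ssh-keygen -t rsa
local:~$ ssh-copy-id remote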
Kristopher Johnson
In order to "avoid creating an extra session", you're building up and tearing down 3 connections, each with exchange of cryptographic keys. I'm not seeing the benefit here.
Paul Tomblin
I thought the "problem" was that the original poster didn't want to have to start a second shell. I'm not worried about the connections, but if that was the question, then this isn't a good answer.
Kristopher Johnson
+1  A: 

You could write a bash script named stransfer that takes a filename argument and substitutes it into the scp command, assuming the server and the path to the files on the server don't change.
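A minimal sketch of such a script, run on the local machine (the remote login and directory are assumed from the question, not specified in this answer):

#!/bin/bash
# stransfer: copy one or more files from a fixed remote directory
# into the current local directory.
REMOTE="avalys@remotehost"          # adjust to your server
REMOTE_DIR="/home/avalys/html"      # adjust to the directory you administer

for f in "$@"; do
    scp "${REMOTE}:${REMOTE_DIR}/${f}" .
done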

Or, if the file is always the same, you could create an alias in your ~/.bashrc file:

alias getwelcome='scp avalys@remotehost:/home/avalys/html/welcome.msg .'

Mark Robinson
+1  A: 

I had the same thought that Paul Tomblin wrote in a comment to the question. Old terminal sessions used to use the X-, Y-, and Z-modem protocols and tools (sz and rz for the ZMODEM variants) to achieve something like this. I'm not sure whether these will work over an ssh session, but it might be worth a try.

Fink supplies an lrzsz package with these tools on Mac OS X.
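If it does work, the remote side of the transfer would be as simple as the following (this assumes a ZMODEM-capable terminal emulator on the local end, which plain terminal windows usually are not):

remote:html avalys$ sz welcome.msg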

Making this a community wiki because I'd feel bad for getting rep after Paul got there first...

dmckee
A: 

Here is my preferred solution to this problem [as given on a duplicate question I asked]. Set up a reverse ssh tunnel upon creating the ssh session. This is made easy by two bash functions: grabfrom() needs to be defined on the local host, while grab() should be defined on the remote host (replace localuser below with your user name on the local machine). You can add any other ssh options you use (e.g. -X or -Y) as you see fit.

function grabfrom() { ssh -R 2202:127.0.0.1:22 ${@}; };
function grab() { scp -P 2202 $@ localuser@127.0.0.1:~; };

Usage:

localhost% grabfrom remoteuser@remotehost
password: <remote password goes here>
remotehost% grab somefile1 somefile2 *.txt
password: <local password goes here>

Positives:

  • It works without special software on either host beyond OpenSSH
  • It works when local host is behind a NAT router
  • It can be implemented as a pair of one-line bash functions

Negatives:

  • It uses a fixed port number so:
    • won't work with multiple connections to remote host
    • might conflict with a process using that port on the remote host
  • It requires that the local host accept ssh connections
  • It requires a special command when initiating the session
  • It doesn't implicitly handle authentication to the localhost
  • It doesn't allow one to specify the destination directory on localhost
  • If you grab from multiple local hosts to the same remote host, ssh will complain about the host key changing

Future work: This is still pretty kludgy. Obviously, it would be possible to handle the authentication issue by setting up ssh keys appropriately, and it's even easier to allow the destination directory on the local host to be specified by adding a parameter to grab(), as sketched below.
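For instance, a variant of grab() that takes the destination directory as its first argument might look like this (a sketch building on the functions above, not part of the original):

function grab() { local dest="$1"; shift; scp -P 2202 "$@" localuser@127.0.0.1:"$dest"; };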

More difficult is addressing the other negatives. It would be nice to pick a dynamic port, but as far as I can tell there is no elegant way to pass that port to the shell on the remote host; as best I can tell, OpenSSH doesn't allow you to set arbitrary environment variables on the remote host, and bash can't take environment variables from a command-line argument. Even if you could pick a dynamic port, there is no way to ensure it isn't already in use on the remote host without connecting first.

Nick