views: 129

answers: 2

In order to back up large database partitions to a remote machine, I'd like to take the output of the database's dump command and send it directly to a remote location over SFTP.

This is useful when dumping large data sets and you don't have enough local disk space to first create the backup file and then copy it to a remote location.

I've tried using Python + Paramiko, which provides this functionality, but the performance is much worse than transferring files with the native OpenSSH sftp binary.

Does anyone have any idea how to do this, either with the native sftp client on Linux or with a library like Paramiko (but one that performs close to the native sftp client)?
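
Roughly, this is the kind of Paramiko-based streaming I mean (a minimal sketch; the host, credentials, dump command and remote path are placeholders):

import subprocess
import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect("remotehost", username="me")
sftp = ssh.open_sftp()

# Run the dump and stream its stdout straight into a remote file over SFTP.
dump = subprocess.Popen(["fancy-sql-dump-command", "--to-stdout"],
                        stdout=subprocess.PIPE)
remote = sftp.open("backups/my-sql-dump.sql", "wb")
remote.set_pipelined(True)  # don't wait for a server ACK after every write; may help throughput
while True:
    chunk = dump.stdout.read(1 << 20)  # 1 MiB blocks
    if not chunk:
        break
    remote.write(chunk)
remote.close()
dump.wait()

sftp.close()
ssh.close()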

+2  A: 

If you have remote shell access (ssh), you can do something like the following:

fancy-sql-dump-command --to-stdout | ssh me@remotehost "cat > my-sql-dump.sql"

Google "pipe over ssh" for more examples, e.g. this example using tar.

Alphax
This will work on SFTP servers that also offer SSH with a shell and access to cat, but it will not work for servers that don't offer a shell or don't offer "cat"; that's why I wanted to go with native SFTP.
freddie
A: 

I'd recommend sshfs, which operates over the SFTP protocol.

Some OS distributions have this packaged; for others you'll need to compile it yourself, for example on Red Hat Enterprise Linux 5.4+ or its clones like CentOS:

sudo yum install fuse-devel glib-devel
sudo usermod -a -G fuse "$USER"

cd /tmp
# (download sshfs-fuse-2.2.tar.gz into /tmp first)
tar xzf sshfs-fuse-2.2.tar.gz
cd sshfs-fuse-2.2
./configure
make
sudo make install

# log out and back in so the new fuse group membership takes effect

mkdir /tmp/servername
sshfs servername:directory /tmp/servername/
ls /tmp/servername/
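
Once it is mounted, the dump can be written straight through the mount point, so no intermediate local copy is needed. A rough Python sketch (the dump command and file name are the same placeholders as in the question):

import subprocess

# Write the dump directly onto the sshfs mount; the data goes over SFTP as it is produced.
with open("/tmp/servername/my-sql-dump.sql", "wb") as out:
    subprocess.check_call(["fancy-sql-dump-command", "--to-stdout"], stdout=out)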
Tometzky
Unfortunately, sshfs performance is not very good (think transferring files larger than 10 GB at a time). So thanks for the suggestion, but this still doesn't solve my problem.
freddie