
I am trying to read a file from a server over SSH using Python; I am using paramiko to connect. I can connect to the server and run a command like 'cat filename' to get the data back, but some of the files I need to read are around 1 GB or more in size.

How can I read the file on the server line by line using Python?

Additional info: What I normally do is run a 'cat filename' command, store the result in a variable, and work off that. But since the file here is quite big, I am looking for a way to read the file line by line off the server.

EDIT: I can read a chunk of data and split it into lines, but the data received in the buffer does not always end on a line boundary. For example, if the buffer holds 300 lines, the last one may be only the first half of a line on the server, with the second half arriving in the next call to the server. I want complete lines.

EDIT 2: What command can I use to print lines of a file in a given range, e.g. the first 100 lines, then the next 100, and so on? That way the buffer would always contain complete lines.
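
For example, assuming sed is available on the server, a fixed range of lines can be printed per call, so each chunk of output ends on a line boundary:

sed -n '1,100p' filename     # lines 1-100
sed -n '101,200p' filename   # lines 101-200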

+2  A: 

What do you mean by "line by line"? There are lots of data buffers between network hosts, and none of them are line-oriented.

So you can read a bunch of data, then split it into lines at the near end.

ssh otherhost cat somefile | python process_standard_input.py | do_process_locally

Or you can have a process at the far end read a bunch of data, break it into lines, and send it to you line by line.

scp process_standard_input.py otherhost
ssh otherhost python process_standard_input.py somefile |  do_process_locally
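
A minimal sketch of what such a process_standard_input.py could look like (the argument handling here is an assumption for illustration; the per-line work would be whatever you actually need):

#!/usr/bin/env python
# process_standard_input.py -- illustrative sketch only.
# Reads the file named on the command line (far-end form) or stdin
# (near-end form) and writes complete lines to stdout.
import sys

source = open(sys.argv[1]) if len(sys.argv) > 1 else sys.stdin
for line in source:           # iterating a file object yields whole lines
    sys.stdout.write(line)    # replace with real per-line processing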

The only difference I would care about is which way reduces the volume of data sent over a limited network pipe. In your situation it may or may not matter.

There is nothing wrong in general with using cat over an SSH pipe to move gigabytes of data.

Joe Koberg
A: 
#!/usr/bin/env python
import select

import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('yourhost.com')
transport = client.get_transport()
channel = transport.open_session()
channel.exec_command("cat /path/to/your/file")
while True:
    # Wait up to a second for the channel to have data ready.
    rl, wl, xl = select.select([channel], [], [], 1.0)
    if rl:
        # Must be stdout
        data = channel.recv(1024)
        if not data:
            # An empty read means the remote command has finished.
            break
        print(data)
client.close()
Noah
Good example of paramiko, but again highlights the non-line-oriented nature of this kind of task.
Joe Koberg
Just keep reading it until you get a newline or other line-terminating character.
Noah
+7  A: 

Paramiko's SFTPClient class allows you to get a file-like object to read data from a remote file in a Pythonic way.

Assuming you have an open SSHClient:

sftp_client = ssh_client.open_sftp()
remote_file = sftp_client.open('remote_filename')
try:
    for line in remote_file:
        pass  # process the line here
finally:
    remote_file.close()
Matt Good
+1, MUCH better than messing with cat (for all that I like felines!-).
Alex Martelli
A: 

You can write your own line-buffering generator, something like this:

def linebuffer(source):
    # Yield complete '\n'-terminated lines read from source.recv().
    data = ""
    while True:
        try:
            # Find the end of the next complete line in the buffer.
            i = data.index('\n') + 1
            yield data[:i]
            data = data[i:]
        except ValueError:
            # No complete line buffered yet; read some more.
            newdata = source.recv(1024)
            if not newdata:
                break
            data += newdata
    if data:
        yield data  # trailing data without a final newline
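
For example, with a paramiko channel as the source (a channel provides the recv() call the generator expects; the channel setup and the process() placeholder are just for illustration):

channel.exec_command("cat /path/to/your/file")
for line in linebuffer(channel):
    process(line)   # each iteration receives one complete line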
gnibbler
A: 

Here's an extension to @Matt Good's answer:

from contextlib import closing
from fabric.network import connect

with closing(connect(user, host, port)) as ssh:
    with closing(ssh.open_sftp()) as sftp:
        with closing(sftp.open('remote_filename')) as f:
            for line in f:
                process(line)
J.F. Sebastian
I've never seen contextlib.closing before. So this lets you turn anything with a close() method into a Context Manager-like thing, notwithstanding that it may not have __enter__ and __exit__?
hughdbrown
@hughbrown: Yes. Any object with `.close()` method will do. The implementation of `closing` is trivial, see http://svn.python.org/view/python/trunk/Lib/contextlib.py?view=markup
J.F. Sebastian
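
For reference, a minimal sketch of what contextlib.closing does (the stdlib version is written as a small class, but it behaves like this generator-based equivalent):

from contextlib import contextmanager

@contextmanager
def closing(thing):
    # Hand the object to the with-block, then close it on the way out,
    # even if the block raises an exception.
    try:
        yield thing
    finally:
        thing.close()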