I'm looking for ways to gather files from clients. These clients have our software installed, and we currently use FTP to collect files from them. The files are pulled from the client's database, encrypted, and uploaded via FTP to our FTP server. The process is fraught with frustration and obstacles. The software is frequently blocked by common firewalls and often runs into difficulties with VPNs and NAT (switching to passive instead of active mode usually helps).

My question is: what other ideas do people have for getting files from clients programmatically and reliably? Most of the files being submitted are < 1 MB in size; however, one of them can be up to 25 MB.

I'd considered HTTP POST; however, I'm concerned that a 25 MB file would often fail over a POST (the web server timing out before the file could be completely uploaded).

Thoughts?

AndrewG

EDIT: We can use any common web technology. We're on a shared host, which may make central configuration changes difficult. I'm familiar with PHP from a usage perspective, but not from a setup perspective (I've written lots of code, but haven't gotten into anything too heavy-duty). Ruby on Rails is also possible, but I would be starting from scratch. Ideally, I'm looking for a "web" way of doing it, as I'd like to eventually be ready to transition away from installed code.
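For context, a bare-bones sketch of what the straight HTTP POST approach would look like on the PHP side (the field and directory names are just placeholders); whether a 25 MB file survives has less to do with the script than with the host's upload_max_filesize, post_max_size, and max_execution_time settings:

```php
<?php
// upload.php -- hypothetical receiver for a multipart/form-data POST.
// On a shared host the limits that usually break large uploads are
// upload_max_filesize, post_max_size and max_execution_time; some hosts
// let you raise them per-directory via .htaccess or a local php.ini.

if ($_SERVER['REQUEST_METHOD'] !== 'POST' || !isset($_FILES['payload'])) {
    header('HTTP/1.1 400 Bad Request');
    exit('Expected a POST with a "payload" file field');
}

$upload = $_FILES['payload'];
if ($upload['error'] !== UPLOAD_ERR_OK) {
    // UPLOAD_ERR_INI_SIZE and UPLOAD_ERR_PARTIAL are the codes you see when
    // the file is over the limits or the connection died mid-transfer.
    header('HTTP/1.1 500 Internal Server Error');
    exit('Upload failed with error code ' . $upload['error']);
}

// Move the temp file somewhere permanent, keeping only the base file name.
$dest = __DIR__ . '/incoming/' . basename($upload['name']);
if (!move_uploaded_file($upload['tmp_name'], $dest)) {
    header('HTTP/1.1 500 Internal Server Error');
    exit('Could not store the uploaded file');
}

echo 'OK';
```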

A: 

Research scp and rsync.

Peter Loron
Sadly, it looks like scp and rsync would both require making a shell connection. I can only get one username for a shell account on my host... and it has full access to my account's entire directory structure. Not the ideal scenario, since this is something that has to be deployed at the client site. I could see the thing crashing out and somehow leaving shell access open to a client that was mad about the crash. Ahh! A good idea, though!
AndrewG
A: 

You probably mean an HTTP PUT. That should work like a charm if you have a decent web server. But as far as I know it is not restartable.
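Handling a PUT on the server is little more than streaming the request body to disk; here's a minimal PHP sketch (the script and directory names are invented):

```php
<?php
// put.php -- hypothetical endpoint for an HTTP PUT upload.
// The raw request body is exposed on the php://input stream, so the
// whole file never has to be held in memory at once.

if ($_SERVER['REQUEST_METHOD'] !== 'PUT') {
    header('HTTP/1.1 405 Method Not Allowed');
    exit('PUT only');
}

// Take the target name from the query string and strip any path components.
$name = isset($_GET['name']) ? basename($_GET['name']) : 'upload.bin';

$in  = fopen('php://input', 'rb');
$out = fopen(__DIR__ . '/incoming/' . $name, 'wb');
stream_copy_to_stream($in, $out);   // copy the body straight to disk
fclose($in);
fclose($out);

echo 'OK';
```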

FTP is the right choice (passive mode to get through the firewalls). Use an FTP server that supports restartable transfers if you often face VPN connection breakdowns (hotel networks are soooo crappy :-) ).

The FTP command that must be supported is REST.

From http://www.nsftools.com/tips/RawFTP.htm:

Syntax: REST position

Sets the point at which a file transfer should start; useful for resuming interrupted transfers. For nonstructured files, this is simply a decimal number. This command must immediately precede a data transfer command (RETR or STOR only); i.e. it must come after any PORT or PASV command.
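To illustrate what the resume looks like from the uploading side, PHP's FTP extension takes a resume offset as the last argument of ftp_fput, which should translate into the REST command described above. A rough sketch with placeholder host, credentials, and paths, assuming the server honours REST for STOR:

```php
<?php
// resume_upload.php -- hypothetical sketch of a restartable FTP upload.
// The final argument of ftp_fput() is the byte offset at which the
// transfer resumes on the server side.

$local  = '/tmp/export.dat.enc';       // placeholder: the encrypted export
$remote = 'incoming/export.dat.enc';   // placeholder: path on the FTP server

$ftp = ftp_connect('ftp.example.com'); // placeholder host
ftp_login($ftp, 'collector', 'secret');
ftp_pasv($ftp, true);                  // passive mode for firewall/NAT friendliness

// If an earlier attempt died part-way through, pick up where it stopped.
$offset = ftp_size($ftp, $remote);     // -1 if the remote file does not exist yet
if ($offset < 0) {
    $offset = 0;
}

$fp = fopen($local, 'rb');
fseek($fp, $offset);                   // skip the bytes the server already has

if (!ftp_fput($ftp, $remote, $fp, FTP_BINARY, $offset)) {
    fwrite(STDERR, "Upload failed at offset $offset; rerun to resume\n");
}

fclose($fp);
ftp_close($ftp);
```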

jdehaan
HTTP PUT is a problem as it isn't supported by my host. FTP has the problem that it is frequently blocked/troublesome at our client sites. I deal with at least three clients a week that can't upload/download via FTP because firewall/anti-virus/spyware software is blocking it in some way, shape, or form. Boo. Otherwise... I'd love to just stay with FTP! ;-)
AndrewG
A: 

One option is to have something running in the browser that breaks the upload into chunks, which would hopefully make it more reliable. A control that does this would also give the user some feedback as the upload progresses, which you wouldn't get with a simple HTTP POST.

A quick Google found this free Java applet which does just that. There will be lots of other free and paid-for options that do the same thing.
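Whatever does the chunking on the client (an applet or an installed agent), reassembling the pieces on the server can stay very simple. Here's a rough PHP sketch with invented field names (name, index, last, chunk):

```php
<?php
// chunk.php -- hypothetical receiver that reassembles a chunked upload.
// The client POSTs each piece in order with the original file name, the
// chunk index, and a "last" flag; a dropped chunk can be resent on its own
// instead of restarting the whole 25 MB transfer.

$name  = basename(isset($_POST['name']) ? $_POST['name'] : 'upload.bin');
$index = isset($_POST['index']) ? (int) $_POST['index'] : 0;
$last  = !empty($_POST['last']);

$partial = __DIR__ . '/incoming/' . $name . '.part';

// A fresh upload (chunk 0) discards any half-finished earlier attempt.
if ($index === 0 && file_exists($partial)) {
    unlink($partial);
}

// Append this chunk to the partial file.
file_put_contents($partial, file_get_contents($_FILES['chunk']['tmp_name']), FILE_APPEND);

// Once the final chunk arrives, promote the partial file to its real name.
if ($last) {
    rename($partial, __DIR__ . '/incoming/' . $name);
}

echo 'OK';
```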

Dave Webb