I'm writing software that uploads and downloads a number of files over FTP to a remote server. Download speeds are fine and stay consistent at upwards of 4 MB/s, and small uploads are instantaneous. The problem is when my program uploads a large 40 MB zip file: I get extremely poor performance. It uploads in bursts (100-200 KB/s), then stalls for a second, and repeats this until the file eventually finishes. Programmatically downloading the same file from the same server takes 30 seconds at most, and uploading it to the same server with FileZilla takes about the same amount of time. Uploading through my software can take up to 15 minutes. Something is clearly wrong.
I am using the Starksoft FTP library to handle uploads/downloads, from here: http://starksoftftps.codeplex.com/
Here is an example of the problematic code:
FtpClient ftp = new FtpClient(sourcecfg.Host);
ftp.MaxUploadSpeed = 0;    // 0 = do not throttle
ftp.MaxDownloadSpeed = 0;  // 0 = do not throttle
ftp.FileTransferType = TransferType.Binary;
ftp.DataTransferMode = TransferMode.Passive;
ftp.Open(sourcecfg.FtpUserName, sourcecfg.FtpPassword);
ftp.PutFile(backupTempPath, targetcfg.getFullPath() + "wordpress-backup.zip", FileAction.Create);
I've also tried using an overloaded version of PutFile that takes a Stream object instead of a path string. The results were unchanged.
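To help isolate whether the bottleneck is in the library or my environment, a minimal cross-check upload using the BCL's FtpWebRequest (available since .NET 2.0, so usable from Visual C# 2008) might look like the sketch below. The host, credentials, and file paths are placeholders, and the binary/passive settings are chosen to match the FtpClient configuration above:

```csharp
using System;
using System.IO;
using System.Net;

class FtpUploadCheck
{
    static void Main()
    {
        // Placeholder URI and credentials -- substitute the real server details.
        var request = (FtpWebRequest)WebRequest.Create(
            "ftp://example.com/wordpress-backup.zip");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("user", "password");
        request.UseBinary = true;   // match TransferType.Binary
        request.UsePassive = true;  // match TransferMode.Passive

        // Stream the file up in 32 KB chunks and time the transfer.
        DateTime started = DateTime.Now;
        using (FileStream source = File.OpenRead(@"C:\temp\wordpress-backup.zip"))
        using (Stream target = request.GetRequestStream())
        {
            byte[] buffer = new byte[32 * 1024];
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                target.Write(buffer, 0, read);
        }

        using (var response = (FtpWebResponse)request.GetResponse())
            Console.WriteLine("{0} ({1:F0}s)", response.StatusDescription,
                (DateTime.Now - started).TotalSeconds);
    }
}
```

If this path uploads at full speed against the same server, that would point at the component library rather than the network or the VirtualBox setup.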
Incidentals: I'm compiling in Visual C# 2008 Express on Windows XP inside a VirtualBox instance. I've tried both the debug and release EXEs with no change in results.
The issue feels like buffering or throttling, but looking at the internal code of the FTP classes I don't see anything unusual, and I'm explicitly setting it not to throttle. Any suggestions or comments about this particular FTP component library?