To get the file size for every file in a list of files, I'm using the following code:
foreach (String f in files)
{
    UriBuilder ftpUri = new UriBuilder("ftp", ftpServer, -1, ftpPfadZuLogDateien + "/" + f);
    FtpWebRequest ftpclientRequest1 = (FtpWebRequest)WebRequest.Create(ftpUri.Uri);
    ftpclientRequest1.Method = WebRequestMethods.Ftp.GetFileSize;
    ftpclientRequest1.Credentials = new NetworkCredential(ftpLoginName, ftpPassword);
    FtpWebResponse response1 = (FtpWebResponse)ftpclientRequest1.GetResponse();
    long filesize = response1.ContentLength;
    response1.Close();
    // store the file size somewhere
}
If there are only a few files in the list, this usually works. But after a number of these requests in a row (sometimes 10, sometimes 100), GetResponse() throws a WebException with FTP error 503 (Bad Sequence of Commands).
What is this error trying to tell me? Am I querying too fast? Am I forgetting to clean up some resource? And what can I do about it?
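For what it's worth, the server's exact reply can be inspected by catching the WebException; a minimal sketch (here `request` stands for one of the FtpWebRequest objects built above):

```csharp
try
{
    using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    {
        long filesize = response.ContentLength;
    }
}
catch (WebException ex)
{
    // For FTP requests, WebException.Response carries the server's reply
    FtpWebResponse error = (FtpWebResponse)ex.Response;
    Console.WriteLine(error.StatusCode);        // e.g. FtpStatusCode.BadCommandSequence for 503
    Console.WriteLine(error.StatusDescription); // the server's full reply line
}
```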
Additional info:
- Setting KeepAlive = false on the request makes it fail on the second request with error 550 (file not found / access denied?).
- Setting UsePassive = false did not change anything.
- Setting UseBinary = true did not change anything.
- Hitting my head on the keyboard did not change anything.
[Update] beckr.org provided an answer. Since it is hidden behind a link and a lot of text, here is the short version: I changed the code so that the same NetworkCredential instance is reused:
NetworkCredential myCredentials = new NetworkCredential(ftpLoginName, ftpPassword);
foreach (String f in files)
{
    UriBuilder ftpUri = new UriBuilder("ftp", ftpServer, -1, ftpPfadZuLogDateien + "/" + f);
    FtpWebRequest ftpclientRequest1 = (FtpWebRequest)WebRequest.Create(ftpUri.Uri);
    ftpclientRequest1.Method = WebRequestMethods.Ftp.GetFileSize;
    ftpclientRequest1.Credentials = myCredentials; // reuse the same credential object
    FtpWebResponse response1 = (FtpWebResponse)ftpclientRequest1.GetResponse();
    long filesize = response1.ContentLength;
    response1.Close();
    // store the file size somewhere
}
This way everything works as it should.
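As an aside, the per-file round trips could be avoided entirely by issuing a single LIST request (WebRequestMethods.Ftp.ListDirectoryDetails) and extracting the sizes from the listing. A hedged sketch only, since the listing format is server-dependent; the column positions below assume a Unix-style "ls -l" listing:

```csharp
// One LIST request for the whole directory instead of one SIZE request per file.
FtpWebRequest listRequest = (FtpWebRequest)WebRequest.Create(
    new UriBuilder("ftp", ftpServer, -1, ftpPfadZuLogDateien).Uri);
listRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
listRequest.Credentials = myCredentials;

using (FtpWebResponse listResponse = (FtpWebResponse)listRequest.GetResponse())
using (StreamReader reader = new StreamReader(listResponse.GetResponseStream()))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // Unix-style line: "-rw-r--r-- 1 owner group 12345 Jan 01 00:00 name"
        string[] parts = line.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
        if (parts.Length >= 9)
        {
            long filesize = long.Parse(parts[4]);
            string fileName = parts[8];
            // store the file size somewhere
        }
    }
}
```

This trades many short control-channel exchanges for one data-channel transfer, which also sidesteps whatever per-connection state was triggering the 503.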