When the user clicks the "Update" button on the page, it sends a GET request.
Within the GET request handler I need to update the database using data from other websites (I download XML chunks and use them to update the database).
I use the thread pool to run the downloads in parallel, and it works well on my local box.
On the server (shared hosting with DiscountAsp.net), though, it only works with up to 7 threads; with 8 or more the browser gets "the connection was reset".
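To give an idea of the fan-out, the handler does roughly this (a simplified sketch; GetPageUrls, DownloadPage and UpdateDatabase stand in for my real methods):

var pageUrls = GetPageUrls();                        // placeholder: list of XML page URLs
using (var remaining = new CountdownEvent(pageUrls.Count))
{
    foreach (var url in pageUrls)
    {
        var pageUrl = url;                           // avoid capturing the loop variable
        ThreadPool.QueueUserWorkItem(_ =>
        {
            try
            {
                var xml = DownloadPage(pageUrl);     // the WebClient code shown below
                UpdateDatabase(xml);                 // placeholder for the DB update
            }
            finally
            {
                remaining.Signal();
            }
        });
    }
    remaining.Wait();                                // the GET handler blocks until every page is done
}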
Where does this limitation come from?
Is there an alternative solution to using threads?
The actual code that downloads the pages is:
var webClient = new WebClient();
webClient.Encoding = Encoding.UTF8;
webClient.Headers.Set("User-Agent", UserAgent);
string pageXml;
try {
    pageXml = webClient.DownloadString(url);
}
catch (Exception ex) {
    throw new ApplicationException("Failed to retrieve XML page from url: " + url, ex);
}
Should I use webClient.DownloadStringAsync(url) instead, or will it just do the same thread-spawning internally, with the same result?
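I.e., something along these lines, if I understand the event-based API correctly (just a sketch; the XML handling is elided):

var webClient = new WebClient();
webClient.Encoding = Encoding.UTF8;
webClient.Headers.Set("User-Agent", UserAgent);
webClient.DownloadStringCompleted += (sender, e) =>
{
    if (e.Error != null)
    {
        // handle the failure for this url
        return;
    }
    string pageXml = e.Result;
    // ... parse the XML and update the database ...
};
webClient.DownloadStringAsync(new Uri(url));   // returns immediately; the event fires when the download finishes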