I want to make 10 asynchronous HTTP requests at once, and only process the results in a single callback function once all of them have completed. I also do not want to block any threads using WaitAll (it is my understanding that WaitAll blocks until all are complete). I think I want to make a custom IAsyncResult that will handle multiple calls. Am I on the right track? Are there any good resources or examples out there that describe handling this?

A: 

Take a look at this article: Asynchronous Calls in .NET - Polling or Waiting for Completion.
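
For reference, here is a minimal sketch of the polling variant that article covers, using HttpWebRequest.BeginGetResponse against a placeholder URL (the article also covers waiting on the AsyncWaitHandle instead):

using System;
using System.Net;
using System.Threading;

class PollingExample
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://www.google.com");
        IAsyncResult asyncResult = request.BeginGetResponse(null, null);

        // Poll IsCompleted instead of blocking on the wait handle.
        while (!asyncResult.IsCompleted)
        {
            Console.WriteLine("Still working...");
            Thread.Sleep(100);
        }

        using (var response = (HttpWebResponse)request.EndGetResponse(asyncResult))
        {
            Console.WriteLine("Status: {0}", response.StatusCode);
        }
    }
}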

alejandrobog
A: 

I think you are better off using the WaitAll approach. Otherwise you will be processing 10 IAsyncResult callbacks, and using a semaphore to determine that all 10 are finally complete.

Keep in mind that WaitAll is very efficient; it is not like the silliness of having a thread poll and "sleep" in a loop. A polling thread keeps waking up and burning processor time just to check for completion, whereas a thread that is "de-scheduled" because it hit WaitAll consumes no processor time at all until the handles are signaled.
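
For comparison, the callback-counting alternative mentioned above might look roughly like this sketch, which uses an Interlocked counter plus a single ManualResetEvent rather than a semaphore (the URL and request count are just placeholders):

using System;
using System.Net;
using System.Threading;

class CountingExample
{
    static int pending = 10;
    static readonly ManualResetEvent allDone = new ManualResetEvent(false);

    static void Main()
    {
        for (int i = 0; i < 10; i++)
        {
            var request = (HttpWebRequest)WebRequest.Create("http://www.google.com");
            request.BeginGetResponse(OnResponse, request);
        }

        // One wait on a single handle instead of WaitAll over ten handles.
        allDone.WaitOne();
        Console.WriteLine("All requests completed");
    }

    static void OnResponse(IAsyncResult ar)
    {
        var request = (HttpWebRequest)ar.AsyncState;
        using (var response = (HttpWebResponse)request.EndGetResponse(ar))
        {
            Console.WriteLine("Got: {0}", response.StatusCode);
        }

        // The last callback to finish releases the waiter.
        if (Interlocked.Decrement(ref pending) == 0)
        {
            allDone.Set();
        }
    }
}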

Brent Arias
In the context of the IIS thread pool, is the thread still in use, or would it be put back in the pool like it is for an asynchronous call?
aepheus
+1  A: 

In .NET 4.0 there's a nice Task Parallel Library that allows you to do things like this:

using System;
using System.Linq;
using System.Net;
using System.Threading.Tasks;

class Program
{
    public static void Main()
    {
        var urls = new[] { "http://www.google.com", "http://www.yahoo.com" };

        // Start one download task per URL, then run a single continuation
        // once every task has completed.
        Task.Factory.ContinueWhenAll(
            urls.Select(url => Task.Factory.StartNew(u => 
            {
                using (var client = new WebClient())
                {
                    return client.DownloadString((string)u);
                }
            }, url)).ToArray(), 
            tasks =>
            {
                var results = tasks.Select(t => t.Result);
                foreach (var html in results)
                {
                    Console.WriteLine(html);
                }
            });
        Console.ReadLine();
    }
}

As you can see, a separate task is started for each URL in the list, and once all the tasks have completed the continuation callback is invoked and handed the results of all the tasks.

Darin Dimitrov
+1  A: 

I like Darin's solution. But if you want something more traditional, you can try this.

I would definitely use an array of wait handles and the WaitAll mechanism:

static void Main(string[] args)
{

    WaitCallback del = state =>
    {
        ManualResetEvent[] resetEvents = new ManualResetEvent[10];
        WebClient[] clients = new WebClient[10];

        Console.WriteLine("Starting requests");
        for (int index = 0; index < 10; index++)
        {
            resetEvents[index] = new ManualResetEvent(false);
            clients[index] = new WebClient();

            clients[index].OpenReadCompleted += new OpenReadCompletedEventHandler(client_OpenReadCompleted);

            clients[index].OpenReadAsync(new Uri("http://www.google.com"), resetEvents[index]);
        }

        // Block this worker thread until all 10 requests have signaled,
        // or the 10-second timeout expires.
        bool succeeded = WaitHandle.WaitAll(resetEvents, 10000);
        Complete(succeeded);

        for (int index = 0; index < 10; index++)
        {
            resetEvents[index].Dispose();
            clients[index].Dispose();
        }
    };

    ThreadPool.QueueUserWorkItem(del);

    Console.WriteLine("Waiting...");
    Console.ReadKey();
}

static void client_OpenReadCompleted(object sender, OpenReadCompletedEventArgs e)
{
    // Do something with data... then close the stream.
    // (Check e.Error first so a failed request doesn't throw here.)
    if (e.Error == null)
    {
        e.Result.Close();
    }

    ManualResetEvent readCompletedEvent = (ManualResetEvent)e.UserState;
    readCompletedEvent.Set();
    Console.WriteLine("Received callback");
}


static void Complete(bool succeeded)
{
    if (succeeded)
    {
        Console.WriteLine("Yeah!");
    }
    else
    {
        Console.WriteLine("Boohoo!");
    }
}
Scott P