I have a library, written in C#, containing one method:

Response CalculateSomething(Request request);

The execution time of this method is relatively long, and there are a lot of requests that need to be processed. I want to use a "cluster": spread this DLL across different machines (nodes) in this "cluster" and write some controller that distributes requests to the nodes. There should be a mechanism that prevents losing a task because of a node crash, as well as load balancing.

Can someone suggest a framework that addresses this issue?

P.S. There is a framework called Qizmt, written in C#, but I think MapReduce is not a good fit for the above scenario.

A: 

Serious application - then the money for the OS also plays a part. Assuming C# means Windows... don't pay for full licenses.

Check Windows HPC. It also has a management framework for managing all the nodes. http://www.microsoft.com/hpc/en/us/developer-resources.aspx

Something smaller?

Use queueing. Write the requests into a SQL queue (or an MSMQ queue), have services on every machine reading from the queue, processing the requests, and sending out the results. No need for any additional framework. The free SQL Server edition should be enough.
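As a rough sketch of the MSMQ variant (assuming System.Messaging and a transactional queue that all worker nodes can read from; the queue paths, Request/Response types and the CalculateSomething stub are placeholders for your own library):

using System.Messaging;

class WorkerNode
{
    // Hypothetical queue paths; in practice these point at a shared, transactional queue.
    const string RequestQueuePath = @"FormatName:DIRECT=OS:controller\Private$\CalcRequests";
    const string ResponseQueuePath = @"FormatName:DIRECT=OS:controller\Private$\CalcResponses";

    static void Main()
    {
        using (var requests = new MessageQueue(RequestQueuePath))
        using (var responses = new MessageQueue(ResponseQueuePath))
        {
            requests.Formatter = new XmlMessageFormatter(new[] { typeof(Request) });

            while (true)
            {
                // Receive inside a transaction: if this node crashes mid-calculation,
                // the message goes back to the queue and another node picks it up,
                // so no task is lost. Load balancing falls out naturally, because
                // idle nodes simply take the next message.
                using (var tx = new MessageQueueTransaction())
                {
                    tx.Begin();
                    var request = (Request)requests.Receive(tx).Body;
                    Response result = CalculateSomething(request);
                    responses.Send(new Message(result), tx);
                    tx.Commit();
                }
            }
        }
    }

    // Stand-in for the method from your DLL.
    static Response CalculateSomething(Request request) { return new Response(); }
}

public class Request { }
public class Response { }

The controller then only needs to enqueue Request messages and drain the response queue; it never has to know which node did the work.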

TomTom
I don't want to reinvent the wheel; I am looking for a small and powerful solution.
petkov_d
A: 

Since you do not mention why MapReduce is no good for you, I will go ahead and suggest that you reconsider your dismissal of said programming model. From the short description of your "algorithm", it seems like a fine candidate for MapReduce. Simply, let CalculateSomething() be your map() function, and use the identity function as reduce().

Unfortunately, I have no experience with Qizmt, so I am not able to tell you how to wire these things up, but I would expect it to be pretty trivial compared to a DIY solution.
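To illustrate the shape of that mapping (this is not Qizmt's actual API, just a generic map/reduce pair with placeholder Request/Response types and a stub for the library call):

using System.Collections.Generic;

static class CalculationJob
{
    // map(): each incoming request becomes one (id, response) pair,
    // produced by calling the existing library method.
    public static IEnumerable<KeyValuePair<string, Response>> Map(string requestId, Request request)
    {
        yield return new KeyValuePair<string, Response>(requestId, CalculateSomething(request));
    }

    // reduce(): the identity function - every mapped value is emitted unchanged.
    public static IEnumerable<Response> Reduce(string requestId, IEnumerable<Response> values)
    {
        return values;
    }

    // Stand-in for the method from the asker's DLL.
    static Response CalculateSomething(Request request) { return new Response(); }
}

public class Request { }
public class Response { }

The framework takes care of distributing the map calls across the nodes and re-running any that are lost to a node failure, which covers both of the requirements in the question.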

Jørn Schou-Rode