I'm currently working on a large real-time OLAP application. All data is held in RAM (a few gigabytes), and the common tasks involve brute-force scanning over large quantities of that data (which is fine). The results of the processing are exposed via a web service (singleton/multithreaded) and presented using a Silverlight-based client.

The problem is that various customers need different functionality/algorithms, and I don't know how to provide extensibility on the server side. For the client side (Silverlight) I can use MEF/Prism, but I'm not sure what a good approach to this problem on the server would be.

Please note that, ideally, other web services should have direct access (i.e. without marshaling) to the data of the main/current service, which holds the large data model.

Are there any:

a) frameworks/libraries

b) patterns

c) good practices

which would help me to modularize the application and make the selection of desired modules and their deployment relatively easy?

+1  A: 

Sounds to me like Dependency Inversion is what's required: isolate the logical parts of the system (algorithms, etc.) by defining interfaces, then use a DI/IoC framework to load the desired implementation at runtime (or on application start, etc.).

I haven't used Ninject, but plenty of people love it, so you could try that; there's also Spring.Net.
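As a rough illustration of that idea with Ninject (the interface and class names here are hypothetical placeholders, not part of your existing code; any of the containers above would look similar):

    using Ninject;

    // Hypothetical contract for the part that varies per customer.
    public interface IScanAlgorithm
    {
        double[] Run(double[] data);
    }

    public class DefaultScanAlgorithm : IScanAlgorithm
    {
        // Brute-force scan over the in-memory data.
        public double[] Run(double[] data) { return data; }
    }

    public class CustomerSpecificAlgorithm : IScanAlgorithm
    {
        // Customer-specific variant, shipped as a separate assembly.
        public double[] Run(double[] data) { return data; }
    }

    public static class CompositionRoot
    {
        public static IScanAlgorithm Compose()
        {
            // The binding could instead come from configuration, or be chosen
            // by scanning a plugin folder at application start.
            var kernel = new StandardKernel();
            kernel.Bind<IScanAlgorithm>().To<CustomerSpecificAlgorithm>();
            return kernel.Get<IScanAlgorithm>();
        }
    }

The web service then depends only on IScanAlgorithm, so swapping the implementation per customer becomes a configuration change rather than a code change.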

Good Practices:

  • Ensure you have clear, precise logging so you know what's being used and when.
  • Think about whether you want a 'default' implementation to load if the desired one fails, or whether you deliberately want to fail so that the wrong data isn't returned by mistake (e.g. by a different algorithm).
  • I've found that using attributes to decorate injectable modules is really helpful (especially in a web-based system that you don't have immediate access to); one reason is that you can build pages or controls that list all the known/available implementations at runtime.

You can also use the attribute approach to build a UI that lets users select which one they want; I use it for an open source web-application framework I built: http://www.morphological.geek.nz/Morphfolia/Capabilities/AttributeDriven.aspx
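As a sketch of the attribute idea (the attribute and catalog below are hypothetical examples, not the API of the framework linked above):

    using System;
    using System.Collections.Generic;
    using System.Reflection;

    // Hypothetical marker attribute for injectable modules.
    [AttributeUsage(AttributeTargets.Class, Inherited = false)]
    public sealed class InjectableModuleAttribute : Attribute
    {
        public InjectableModuleAttribute(string displayName)
        {
            DisplayName = displayName;
        }

        public string DisplayName { get; private set; }
    }

    [InjectableModule("Standard aggregation")]
    public class StandardAggregation { }

    public static class ModuleCatalog
    {
        // Enumerates every decorated type in an assembly, e.g. to populate an
        // admin page or a UI that lets users pick an implementation at runtime.
        public static IEnumerable<KeyValuePair<Type, string>> Discover(Assembly assembly)
        {
            foreach (Type type in assembly.GetTypes())
            {
                object[] attrs = type.GetCustomAttributes(typeof(InjectableModuleAttribute), false);
                if (attrs.Length > 0)
                {
                    var attr = (InjectableModuleAttribute)attrs[0];
                    yield return new KeyValuePair<Type, string>(type, attr.DisplayName);
                }
            }
        }
    }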

Adrian K
With an OLAP back-end? DI might be a logical way to plug in, but I've yet to see one with great performance characteristics.
jro
I haven't done much DW work, but I thought you needed to know the questions you want to ask, as that influences its design; in that context I wouldn't expect people to be able to chop and change algorithms. I'm not an expert on DI as far as performance goes, but I would have thought the performance would be reasonable, depending on your approach. In your case, if you're after specific performance requirements and flexibility, you might find yourself in green 'design patterns' pastures.
Adrian K
We are already using DI (Unity) internally on both the server and client side. Each plugin should have at least two assemblies: one for the server side, which will be loaded into the same AppDomain where the main data is held (to avoid marshaling), and another one which is loaded on the client side (Silverlight) to expose additional functionality to the user. Both assemblies can be discovered automatically and, if they are present in the Plugins folder, they can be loaded. We just don't know how to automatically expose the new functionality in the web service that glues the plugin's server/client assemblies together.
Karol Kolenda
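For what it's worth, a minimal sketch of the discovery/loading step described above, assuming a hypothetical IServerPlugin contract and the Unity container already in use (all names here are placeholders):

    using System;
    using System.IO;
    using System.Linq;
    using System.Reflection;
    using Microsoft.Practices.Unity;

    // Hypothetical server-side contract implemented by each plugin's server assembly,
    // so it can work against the in-memory data model directly (no marshaling).
    public interface IServerPlugin
    {
        string Name { get; }
        object Execute(string operation, object[] arguments);
    }

    public static class PluginLoader
    {
        // Loads every assembly found in the Plugins folder into the current AppDomain
        // and registers each IServerPlugin implementation with the existing Unity container.
        public static void RegisterPlugins(IUnityContainer container, string pluginFolder)
        {
            foreach (string path in Directory.GetFiles(pluginFolder, "*.dll"))
            {
                Assembly assembly = Assembly.LoadFrom(path);

                var pluginTypes = assembly.GetTypes()
                    .Where(t => typeof(IServerPlugin).IsAssignableFrom(t) && t.IsClass && !t.IsAbstract);

                foreach (Type pluginType in pluginTypes)
                {
                    // Named registrations, so ResolveAll<IServerPlugin>() returns all of them.
                    container.RegisterType(typeof(IServerPlugin), pluginType, pluginType.FullName);
                }
            }
        }
    }

One possible way to glue this to the web service is a single generic operation (e.g. Execute(pluginName, operation, arguments)) that looks the plugin up via ResolveAll<IServerPlugin>() and forwards the call, so newly dropped-in plugins become reachable without changing the service contract.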