We are developing a project that involves about 10 different WCF services with several endpoints each. One of the services keeps a few big tables of data cached in memory.

We have found we need access to that data from another service. Rather than keeping 2 copies of the cache, I'd like to be able to share those tables across all services.

I have done some research and found some articles about using an IExtension attached to the service hosts to store the shared data.
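For reference, the pattern those articles describe looks roughly like this (just a sketch; SharedCacheExtension and CachedTables are names I've made up):

    using System.Data;
    using System.ServiceModel;

    // An extension object that can be attached to a ServiceHost and used
    // to carry the cached tables.
    public class SharedCacheExtension : IExtension<ServiceHostBase>
    {
        public DataSet CachedTables { get; set; }

        public void Attach(ServiceHostBase owner) { }
        public void Detach(ServiceHostBase owner) { }
    }

    // For the data to be shared, the same instance would have to be added to
    // every host (e.g. from a custom ServiceHostFactory):
    //     host.Extensions.Add(sharedInstance);
    // and looked up later with:
    //     host.Extensions.Find<SharedCacheExtension>();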

Provided that all the services are running under the same web site, will that work? And is it the right approach? Or should I be looking elsewhere?

A: 

Provided all your services are part of the same application, there doesn't seem to be any reason why you can't share the cache directly via a shared object reference. The simplest way of doing this is via a static field.

If you choose this approach, one thing to be very careful about is thread safety. If your cache is accessed concurrently by two WCF sessions, you must ensure that the two sessions are not going to interfere with each other by both changing the cache at the same time. If the cache is read-only, this is less of a concern, but you might still need to synchronise initialisation of the cache.
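A minimal sketch of what that might look like (all the names here are illustrative, and the tables are just keyed objects), with a lock so two sessions cannot change the cache at the same time:

    using System.Collections.Generic;

    // A static field shared by every service in the application, guarded by a
    // lock so concurrent WCF sessions cannot corrupt it.
    public static class SharedCache
    {
        private static readonly object sync = new object();
        private static readonly Dictionary<string, object> tables =
            new Dictionary<string, object>();

        public static object Get(string key)
        {
            lock (sync)
            {
                object value;
                tables.TryGetValue(key, out value);
                return value;
            }
        }

        public static void Set(string key, object value)
        {
            lock (sync)
            {
                tables[key] = value;
            }
        }
    }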

Jimmy McNulty
I have often used the static variable method for regular ASP.NET applications, but I'm concerned about the life cycle of the services. Is the cache guaranteed to be initialized if, e.g., service A tries to access the cache in service B, which hasn't been called (activated) yet?
axel_c
You would need to put the initialisation code into the static cache class itself, or into the cache controller. That way it is encapsulated away from both service A and service B: neither depends on the other to initialise the cache, and whichever accesses it first triggers a lazy initialisation.
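For instance, something along these lines (just a sketch; CachedTables and LoadFromDatabase are placeholder names), where Lazy<T> guarantees that the first access, whichever service it comes from, triggers a single thread-safe load:

    using System;
    using System.Data;

    // Neither service initialises the cache explicitly; whichever one touches
    // Data first causes LoadFromDatabase to run exactly once.
    public static class CachedTables
    {
        private static readonly Lazy<DataSet> data =
            new Lazy<DataSet>(LoadFromDatabase, isThreadSafe: true);

        public static DataSet Data
        {
            get { return data.Value; }
        }

        private static DataSet LoadFromDatabase()
        {
            // ... populate the big tables here ...
            return new DataSet();
        }
    }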
Jimmy McNulty
+1  A: 

If the data that you're caching is required by more than one service, it sounds - from a Service Oriented Architecture perspective, anyway - like it doesn't belong in either of the services you have calling it.

If the data being cached isn't really related to either service, but is something that both services need, then perhaps it belongs in its own separate service. Have you considered encapsulating your cache in a third service, and performing a service-to-service call to retrieve the data you need? Benefits include...

  1. It solves your original dilemma, avoiding the need to read the whole cache from the database several times;
  2. It encapsulates the cache in one place for easy maintenance/change later;
  3. It allows you to abstract the implementation of the cache away from the other services by putting another service interface in the way.

All in all, I'd suggest that's the best approach. The only downside is the extra overhead of making the service-to-service call, but that surely outperforms having to read the whole cache from the database.
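As a rough sketch of what that third service's contract and host might look like (ICacheService, GetTable and the hosting attributes here are my own illustration, not a prescription):

    using System.Collections.Generic;
    using System.Data;
    using System.ServiceModel;

    // Contract the other services would call instead of holding their own copy.
    [ServiceContract]
    public interface ICacheService
    {
        [OperationContract]
        DataTable GetTable(string tableName);
    }

    // A single instance so every caller shares one in-memory cache; the tables
    // are loaded once and treated as read-only afterwards.
    [ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                     ConcurrencyMode = ConcurrencyMode.Multiple)]
    public class CacheService : ICacheService
    {
        private readonly Dictionary<string, DataTable> tables = LoadTables();

        public DataTable GetTable(string tableName)
        {
            DataTable table;
            return tables.TryGetValue(tableName, out table) ? table : null;
        }

        private static Dictionary<string, DataTable> LoadTables()
        {
            // ... read the big tables from the database once at start-up ...
            return new Dictionary<string, DataTable>();
        }
    }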

Alternatively, if the data in your cache is very closely related to BOTH of the services that are calling the cache, i.e. both services add/change the data in the cache, etc., then perhaps the two existing services should be combined into a single service.

If what I'm saying is making some sense, then the principle of SOA I'm drawing on is Service Autonomy.

sgreeve
Well, it is a medium-sized enterprise application. I agree about the service autonomy point. The data was originally meant to be read from the DB by several services, but because of performance concerns we want to move it to an in-memory cache. The data is 'core' to many services so, although unorthodox, I think the shared cache makes sense.
axel_c
Thanks for the clarification. Personally, I would certainly encapsulate the cache in another service and implement the service-to-service calls. I'll update my answer accordingly.
sgreeve