views: 61

answers: 2

I have got myself saddled with an existing application, which attempts to do a lot of things by itself and does most of them badly!

This application is an ASP.NET 3.5 web application with the typical UI -> Business -> DAL kind of layering.

The problem I am faced with is that when multiple users request the same information, even if only to view it, the same business objects are instantiated and the same data is retrieved all the way from the DB every time.

Currently, there is no single point of instantiation for these objects, like a factory implementation or something along those lines. Object instantiation is just scattered all over the place.

One option I have been thinking of is to encapsulate the entire instantiation logic in factories, and to build logic into the factories to cache objects and return them accordingly.
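A minimal sketch of the caching factory I have in mind (written in Python for brevity - the real app is .NET, and the class and loader names here are made up):

```python
import threading

class CustomerFactory:
    """Hypothetical factory that caches business objects by key."""

    def __init__(self, load_from_db):
        self._load = load_from_db      # stand-in for the expensive DAL call
        self._cache = {}
        self._lock = threading.Lock()  # concurrent web requests share the cache

    def get(self, customer_id):
        with self._lock:
            if customer_id not in self._cache:
                self._cache[customer_id] = self._load(customer_id)
            return self._cache[customer_id]

# Every call site asks the factory instead of newing up objects itself.
factory = CustomerFactory(load_from_db=lambda cid: {"id": cid})
a = factory.get(42)
b = factory.get(42)
assert a is b  # the second call is served from the cache, no DB hit
```

The point is that once all instantiation goes through one place, the caching decision lives there too instead of being scattered across the call sites.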

Another option I am considering: I have used CSLA in the past on one of my projects, and I seem to remember that it cached business objects. How it achieved that, however, is not something I am aware of. I am considering using a framework like CSLA that provides such a caching facility for my business objects.

However, both of the above options are quite invasive and would involve a major if not complete revamp impacting the entire business layer, which is a major concern to me, as there are NO (read zilch, nada, zero, the big -O) automated unit tests for any part of the code.

What I wanted to find out is:

  1. Does anyone know of any frameworks / products that provide a less invasive way for me to achieve this without redesigning / rewriting the entire business layer?

  2. In the absence of 1, are there any easier-to-implement suggestions compared to the two options above?

And No, quitting the company is not an option - not just yet anyways! :-)

+3  A: 

Find the worst offenders via profiling and memoize their calls.

response to comment

Memoization is indifferent to usage patterns; it is really just replacing an expensive operation (hitting the database) with a less expensive lookup in core. The pattern of usage certainly affects how effective the memoization is within the system, but it does not inform how the memoization code is written.

Here's a cheap example:

import math

cache = {}
compute_root = math.sqrt   # stand-in for the expensive operation

def square_root(n):
   if n not in cache:
      cache[n] = compute_root(n)
   return cache[n]

This means that the first time you call square_root(1729) it will take time to compute the root. All subsequent calls of square_root(1729) will only require the cache lookup. If that call is never made again, you'll have wasted the caching time and space.

Thus, memoizing square_root only requires that there be a unique result for each n. Whether it is a useful optimization is a function of actual usage.

msw
+1 most bang for the buck.
marr75
@msw - Wouldn't the worst offenders depend on the usage patterns of the end users? Would this need me to first figure out how the application is intended to be used, or am I not understanding this correctly? I didn't mention it earlier, but the application is not live yet... it's just been in development for the past 2 years :(-
InSane
If it isn't live, then I hope you have some test data and I hope it is intended to be somewhat representative of live data. If not, then we are left with the old Zen question: "If a program is never deployed, does it have performance problems?"
msw
@msw - I do have reasonable test data which is representative, but not much information about how users will typically use the screens - i.e. if they create a new entity today, does it get queried / looked up often by a lot of other people in other departments, does it get changed often, etc. I have not done memoization before, but based on the link you gave and my understanding of it, that kind of information would be needed for memoizing. Is that understanding correct?
InSane
@msw - also, your point about whether it is a performance issue at all is a valid one. Since I have been brought in to clean up, to some extent I am looking at some pro-active optimization! But this is definitely something I need to re-think as to whether it's worth the time / effort at this point.
InSane
@In Sane: see response above
msw
@msw - thanks for the inputs! I think I will try out this approach.
InSane
A: 

Caching is baked into ASP.NET, and in .NET 4.0 (I know you haven't switched yet) the same caching can be used anywhere, inside or outside of ASP.NET classes.

You'll still need to get your object instantiation under control. I suggest a dependency injection framework; I like Autofac. This will allow you to request new objects from some dependency-injected provider, and to decide how you're going to handle caching inside that provider.
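A rough sketch of the provider idea (a toy Python stand-in for a real DI container like Autofac; the names and API here are invented for illustration):

```python
class Provider:
    """Toy DI-style provider: resolves a registration, optionally caching
    the instance so every consumer shares it (a cached/singleton scope)."""

    def __init__(self):
        self._registrations = {}  # name -> (factory, cached?)
        self._instances = {}      # name -> cached instance

    def register(self, name, factory, cached=False):
        self._registrations[name] = (factory, cached)

    def resolve(self, name):
        factory, cached = self._registrations[name]
        if not cached:
            return factory()              # fresh instance every time
        if name not in self._instances:
            self._instances[name] = factory()
        return self._instances[name]      # shared cached instance

provider = Provider()
provider.register("report", lambda: object(), cached=True)
provider.register("command", lambda: object(), cached=False)

assert provider.resolve("report") is provider.resolve("report")        # shared
assert provider.resolve("command") is not provider.resolve("command")  # fresh
```

The key point: the caching decision is made once, per registration, in the provider - call sites just ask for what they need and never know whether it came from a cache.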

You came to us with one problem and I've given you three (learn how to use and implement a dependency injection framework, learn how to use ASP.NET caching, use a caching strategy to solve your performance issues), but this is probably your most flexible and maintainable bet.

marr75
@marr75 - yes, you have given me a lot to look over! Thanks for the inputs though! I know that there is probably no quick and easy way out here - though I am still hoping for one :-)
InSane