views:

204

answers:

2

We would like to cache data for some dropdown lists on the client machine to reduce the transfer of slow-changing info. This data is used on multiple pages within a business app; for example, a customer list of 5,000 customers is frequently used for searching, creating a sales order, and other purposes. We've tried both a full load of the data with the page and paged database access that loads the grid on demand, bringing back only 25-50 records at a time. Users are complaining about the performance of both options and want it faster, so we are looking for options on how to cache the data at the client for reuse.

We've seen some mention online that perhaps a JS file could be generated by the server and cached locally to serve as the datasource for the DDL, but we never found how to do this.

Any suggestions? Or other options that you would recommend?

Note: we'll need to address expiry of this too as some of the lists will change a few times per day and should be refreshed (not on a timer but when they change based on response codes).

+1  A: 

You could do this via JavaScript: convert the list to JSON and serve it up via a script tag from the server cache. I say JSON because jQuery, or any other JavaScript framework, can make easy use of it.

e.g. in the html/aspx:

<script src="CustomerList.aspx?Refresh=12308798798023745" type="text/javascript"></script>

The Refresh value is the DateTime.Now.Ticks of the last time the list changed on the server. So while you're putting this tag in every page, the client would fetch it only once, and again only when it actually changed on the server.

Something like this server-side:

public class Cache {
  public static List<Customer> Customers { get; set; }
  public static DateTime LastRefresh { get; set; }
  public static void RefreshCustomers() {
    // Populate Customers here, e.g. Customers = ...;
    LastRefresh = DateTime.Now;
  }
}

In the page:

ScriptTagwithRunAtServer.Src = "CustomerList.aspx?Refresh=" + Cache.LastRefresh.Ticks;

Any way you render that tag is fine; this is just an example. The CustomerList.aspx page would simply render Cache.Customers in JSON form, and your page would run a little JavaScript to make whatever use of it.
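As a rough sketch of what I mean (all names here are hypothetical illustrations, not part of any existing API): the CustomerList.aspx response could emit the customers as a plain JavaScript array, and each page could turn that array into option markup for its dropdown:

```javascript
// Hypothetical shape of the script CustomerList.aspx renders; the
// browser caches it like any other .js file and only re-fetches it
// when the ?Refresh=ticks query string changes.
var customerList = [
  { id: 1, name: "Acme Ltd" },
  { id: 2, name: "Globex Corp" }
];

// Turn the cached array into <option> markup for a dropdown. A page
// would assign the result to its select element's innerHTML (with
// plain DOM code or jQuery).
function toOptionsHtml(customers) {
  var html = "";
  for (var i = 0; i < customers.length; i++) {
    html += '<option value="' + customers[i].id + '">' +
            customers[i].name + '</option>';
  }
  return html;
}
```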

Just an idea, please comment if it interests you and I'm leaving something out.

Nick Craver
This would be my approach if I were to go ahead with this (and TBH, I would try and avoid it and keep it server-side). However, I would also consider plumbing in some kind of expiry time to add *some* element of "control".
Rob Cooper
@Rob - This is the idea of the `?Refresh=####`, it tells the client to refresh when updated JSON is available...or you mean cache expiration?
Nick Craver
Nick - this is of interest. Is this then cached at the client side as well? Could we get a way to cache the customer list once on the client side and reuse that cache on multiple pages (without going back to the server for the customer list), i.e. on the Orders page, the Order search page, ...?
asp2go
@asp2go - Yes, this approach would use the same JavaScript file cached in the client browser cache across all pages; the browser wouldn't care which page that JavaScript was being used on. The cached .js file can also contain the jQuery/JavaScript that populates whatever is using the data (select list, search box, etc.).
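To make that concrete: once the whole list lives in a client-side array, searching it needs no server round trip at all. A minimal sketch (function and field names are assumptions for illustration):

```javascript
// Case-insensitive substring search over the cached customer array.
// Wired to a search box's keyup event, this filters the dropdown
// locally instead of paging 25-50 records through the server.
function filterCustomers(customers, query) {
  var q = query.toLowerCase();
  var matches = [];
  for (var i = 0; i < customers.length; i++) {
    if (customers[i].name.toLowerCase().indexOf(q) !== -1) {
      matches.push(customers[i]);
    }
  }
  return matches;
}
```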
Nick Craver
@Nick - Thanks for the info. This sounds just like what we are looking for - we're just not really sure how to implement it, so we'll do some searching and try to get this to work. If anyone has more examples or detail on how to set this up, that would be great!
asp2go
+2  A: 

Before you start caching things on the client, which is an unpredictable way of doing things for the simple reason that you have no control over how the browser is set up to cache (what if they have caching set to "never"?), there are a lot of things you should look at.

  1. Have you looked at caching on the server first? You can use the OutputCache and vary it by parameter (the grid page number, for example) for partial caching of all the viewed datasets.

  2. What exactly is slow about the user experience? Analyze it with Firebug or Fiddler and see how many requests are being made to the server per page. Simply minimizing the number of HTTP requests being sent (by merging and minifying CSS and JavaScript files, for example) can do wonders for the response time.

  3. What's in your ViewState? Reduce the amount of data being sent back and forth in ViewState for faster page loading.

  4. Is the data fetching routine on the server taking too long? Optimize it, and perhaps cache the data on the server so you don't have to go out to the database as often.

Maybe you've tried all this already, but I thought I'd throw it out there because relying on client-side caching can be a frustrating exercise.

womp
Thanks for the suggestion - we have tried all of these points, and while they help, there is still an issue. (1) Yes - caching on the server has been done where possible (some search pages are too dynamic to define iterations of fixed parameters). (2) For example, the screen where you create a customer order has a basic header and a detail grid. Loading the customers all at once (>5,000) is very slow, but even loading 50 customers on demand using advanced SQL paging through a custom object takes 3-4 seconds due to slow network connections.
asp2go
(3) Yes, we've done this - the initial page size is about 175 KB with ViewState off on most controls, and the Ajax posts to retrieve customers are 1 KB. On a high-speed server with high-bandwidth users this performs extremely well, and scrolling through the dropdowns with 'virtual paging' even feels instantaneous, but unfortunately in our scenario there is a 3-4 second response for each request, so filling in just the order header with multiple dropdowns is frustrating to users. (4) It isn't the fetching that is a problem - SQL Profiler shows a 10-15 millisecond query time.
asp2go
If we include the data (just Id and Name) for just three dropdowns (1,000-5,000 entries per DDL) in the page at load time, the page size jumps closer to a MB and the response is over 20 seconds on the users' network. Of course, after that the page works fine, but we're looking for fast page load AND fast dropdown response. The network/bandwidth upgrade cost in this scenario is also extremely high, so that can't be used to solve the issue.
asp2go