tags:

views: 60

answers: 2

Here is my situation: I have a J2EE single-page application. All client-server communication is AJAX based, with JSON used as the data-exchange format. One of my requests takes around 1 minute to calculate the data required by the client. This data is also huge (it could be > 20 MB), so it is not possible to pass it all to JavaScript in one go. For that reason I am only passing a few records to the client and using a grid with a paging option to display the data.

Now when the user clicks the next-page button, I need to get more data. My question is: how do I cache the data on the server side? I need this data for only one user at a time. Would you recommend caching all the data on the first request, using the session ID as the key?

Any other suggestions?

A: 

The cheapest (and not entirely ineffective) way of caching data in a J2EE web application is to use the Session object, as you intend to do. The drawback is that it requires the developer to ensure that the cache does not leak memory; it is up to the developer to null out the reference to the object once it is no longer needed.
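A rough sketch of that session-scoped cache (the attribute name "reportData", ReportRow, and the servlet itself are placeholders, not something from the question):

```java
import java.io.IOException;
import java.util.List;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

// Sketch only: cache the expensive result in the user's session so it is
// computed once per user. "reportData" and ReportRow are placeholder names.
public class ReportServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        HttpSession session = req.getSession(true);

        @SuppressWarnings("unchecked")
        List<ReportRow> data = (List<ReportRow>) session.getAttribute("reportData");
        if (data == null) {
            data = computeExpensiveReport();          // the ~1 minute calculation
            session.setAttribute("reportData", data); // cached for this user only
        }
        // ... write the requested page of 'data' back to the client as JSON ...
    }

    // Explicit cleanup once the data is no longer needed, so the session
    // does not hold on to 20 MB for its whole lifetime.
    private void evictReport(HttpSession session) {
        session.removeAttribute("reportData");
    }

    private List<ReportRow> computeExpensiveReport() {
        return java.util.Collections.emptyList(); // placeholder for the slow computation
    }
}

class ReportRow { /* placeholder record type */ }
```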

However, even if you do implement this poor man's cache, caching 20 MB of data is not advisable, as it does not scale well. The scalability problem arises when multiple users hit the same functionality at once, in which case 20 MB per user is a lot of data.

You're better off returning paginated "datasets" as JSON, based on the ValueList design pattern. Each request for the data results in a partial retrieval, which is then sent down the wire to the client. That way you never have to cache the complete result of the query execution, and you can return partial datasets. Whether you cache at all is entirely up to you; usually caching is done for large datasets that are used time and again.
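A minimal sketch of that per-page retrieval (the request parameters "page"/"pageSize", the DAO, and the JSON serialization are assumptions for illustration):

```java
import java.io.IOException;
import java.util.Collections;
import java.util.List;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Sketch: each AJAX request returns only the slice of data the grid asked for,
// serialized as JSON. Parameter names and the DAO are illustrative placeholders.
public class PagedDataServlet extends HttpServlet {

    private final ReportDao reportDao = new ReportDao();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        int page = Integer.parseInt(req.getParameter("page"));
        int pageSize = Integer.parseInt(req.getParameter("pageSize"));

        // Retrieve only this page; the full result set is never held in memory.
        List<ReportRow> rows = reportDao.findPage(page, pageSize);

        resp.setContentType("application/json");
        resp.getWriter().write(toJson(rows)); // serialize with a JSON library of your choice
    }

    private String toJson(List<ReportRow> rows) {
        return "[]"; // placeholder for real JSON serialization
    }
}

class ReportDao {
    List<ReportRow> findPage(int page, int pageSize) {
        return Collections.emptyList(); // placeholder: query only the requested slice
    }
}

class ReportRow { /* placeholder record type */ }
```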

Vineet Reynolds
I need to keep this data in the cache as long as the user is active (as long as the user session is valid). Performance is the only reason I am keen to cache such a large amount of data. Requesting partial data is as expensive as retrieving the entire dataset, so each request would take around 1 minute, which is not acceptable. Anyway, I will continue to look for a better alternative. Thanks.
ashish
A: 

I am assuming you are using a DB backend for this. I'd use limits to return small chunks of data; most DB vendors have a solution for this. That would make your queries faster, and most JS frameworks with grid-type components support paginating results (ExtJS, for example).
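For example (table and column names are made up, and the LIMIT/OFFSET syntax is the MySQL/PostgreSQL flavor; other vendors have equivalents such as ROWNUM or FETCH FIRST), the per-page query could look like this:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

// Sketch: let the database do the paging with LIMIT/OFFSET.
// The table and column names are placeholders.
public class ReportDao {

    private final Connection connection;

    public ReportDao(Connection connection) {
        this.connection = connection;
    }

    public List<String> findPage(int page, int pageSize) throws SQLException {
        String sql = "SELECT name FROM report_rows ORDER BY id LIMIT ? OFFSET ?";
        try (PreparedStatement ps = connection.prepareStatement(sql)) {
            ps.setInt(1, pageSize);
            ps.setInt(2, page * pageSize);
            try (ResultSet rs = ps.executeQuery()) {
                List<String> names = new ArrayList<String>();
                while (rs.next()) {
                    names.add(rs.getString("name"));
                }
                return names;
            }
        }
    }
}
```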

If you are fetching data from a 3rd party and passing it on (with or without modifications), I'd still stick to the database and use this workflow: pull the data from the 3rd party, save it in the DB, and have your widget request the small chunks the customer needs, as in the sketch below.
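A rough outline of that pull-once, serve-in-chunks workflow (ThirdPartyClient, ReportStore and the method names are invented for illustration):

```java
import java.util.Collections;
import java.util.List;

// Sketch of the workflow: fetch once from the third party, persist locally,
// then answer every grid request from the local store.
public class ReportService {

    private final ThirdPartyClient thirdParty = new ThirdPartyClient();
    private final ReportStore store;

    public ReportService(ReportStore store) {
        this.store = store;
    }

    // Run once (or on a schedule); this is the slow, expensive part.
    public void refresh() {
        List<String> allRows = thirdParty.fetchEverything();
        store.replaceAll(allRows); // persist locally in the database
    }

    // Called by the grid on every page click; cheap, reads only a small slice.
    public List<String> page(int page, int pageSize) {
        return store.findPage(page, pageSize);
    }
}

class ThirdPartyClient {
    List<String> fetchEverything() {
        return Collections.emptyList(); // placeholder for the remote call
    }
}

interface ReportStore {
    void replaceAll(List<String> rows);
    List<String> findPage(int page, int pageSize);
}
```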

Hope this helps.

Greg
I have no choice but to fetch all the data in one go on the first request. Populating this data is also expensive, and that's why I do not want to do the processing each time. So limiting with a DB query is not an option. Thanks.
ashish