I currently have an application that gets hit by over 20,000 users daily, and they mostly look at one data table. This table displays about 20 rows but is pulled from a "datatable" in a db containing 200,000-600,000 records. Edit: These 20 rows are "dynamic" and do change if the user enters any information via a text box.

I also currently hold user data along with profile data.

I am currently making about 4 callbacks each time a datatable is displayed, and I have been unable to get it down to 1 call.

Question: I was wondering if I could actually fill the application state every 5 seconds with the 200,000-600,000 rows of data, and whether it would actually speed up the system? Edit: due to the dynamic rows that a user or any other user enters, the content needs to be refreshed often.

Question 2: How much can I actually hold in the application cache and still have it be faster?

Edit: With over 20,000 users accessing these 200,000 rows, I would need to cache all of them, or at least I think that would be best practice. When users come to my site, this is one of the main pages they look at, and they probably come back to it 2-5 times per visit.

Edit: Each user sees a unique set of 20 rows that could be different from the 20 rows any other user sees. It is a VERY dynamic site in which a couple of different rows can get updated around once a second.

Edit: If stored in session state, it will only speed up repeat views of the page by the same person, not the overall application, because a person could view the page once and then leave.

A: 

You say they mostly look at one table, and that table is pulled from 200K to 600K rows. How often is that table pulled? Is this a "homepage" type scenario where users are mostly looking at the first page of data? Why cache all 200K rows? Why not cache just the first 20?
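A minimal sketch of that idea (Python stand-in for illustration; the class name, `fetch_page` callable, and the 5-second TTL are assumptions, not part of the original answer): cache only the small result set users actually see, and refresh it on a short expiry so the database is hit at most once per window.

```python
import time

class ShortLivedPageCache:
    """Caches only the small page of rows users actually see,
    refreshing after a short TTL instead of caching all 200K+ rows."""

    def __init__(self, fetch_page, ttl_seconds=5):
        self._fetch_page = fetch_page   # callable that queries the DB for the visible rows
        self._ttl = ttl_seconds
        self._cached = None
        self._fetched_at = 0.0

    def get_page(self):
        now = time.time()
        if self._cached is None or now - self._fetched_at > self._ttl:
            self._cached = self._fetch_page()   # at most one DB hit per TTL window
            self._fetched_at = now
        return self._cached

# Usage: every visitor shares one cached page; the DB sees at most
# one query per 5-second window instead of one per page view.
cache = ShortLivedPageCache(lambda: [{"row": i} for i in range(20)], ttl_seconds=5)
page = cache.get_page()
```

The trade-off is staleness: within one TTL window every user sees the same snapshot, which matches the "refresh every 5 seconds" idea in the question.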

WaldenL
A: 

Are you sure you want to store that in Session state? I'd prefer Application state if all users pull from the same database; that way just one dataset is stored in memory instead of one copy per user.

I think the memory limit is controlled by IIS. There are Maximum virtual memory and Maximum used memory limits. Don't forget to check availability of data.
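A back-of-the-envelope sketch of why that one-shared-copy point matters (the per-row byte size here is an assumption for illustration, not a measured figure):

```python
rows = 200_000
bytes_per_row = 200          # assumed average in-memory row size
users = 20_000

# Session state keeps one copy of the dataset per user session;
# Application state keeps a single shared copy.
session_state_mb = rows * bytes_per_row * users / 1024**2
application_state_mb = rows * bytes_per_row / 1024**2

print(f"Session state (per-user copies): {session_state_mb:,.0f} MB")
print(f"Application state (one copy):    {application_state_mb:,.0f} MB")
```

Even with generous assumptions, per-session copies of a 200K-row table are far beyond any IIS worker-process memory limit, while one shared copy is small.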

Check this: Configuring ASP.NET Applications in Worker Process Isolation Mode (IIS 6.0)

artur02
A: 

Can you clarify for me - are you saying a user gets a datatable of 20 records that is unique to that user and is the result of querying the 600K table? Are the records static for the user?

If there are only 20 records that remain static once they are associated with the user, can you create serialized objects that can be streamed to the user on request? That is, put them in a state where they are ready to go so you don't have to hit the DB.
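As a hedged sketch of that pre-serialization idea (Python and JSON used as stand-ins; the function and user IDs are invented for illustration): serialize each user's 20 rows once, then serve the ready-made bytes on request without touching the DB.

```python
import json

def build_serialized_payloads(user_rows):
    """Pre-serialize each user's rows so a request handler can stream
    the bytes directly, with no DB hit and no per-request serialization."""
    return {
        user_id: json.dumps(rows).encode("utf-8")
        for user_id, rows in user_rows.items()
    }

payloads = build_serialized_payloads({
    "alice": [{"id": 1, "value": "a"}],
    "bob":   [{"id": 2, "value": "b"}],
})
# On request: write payloads[user_id] straight to the response stream.
```

As the follow-up comment notes, this only pays off if the rows stay static long enough that the cached payloads don't have to be rebuilt constantly.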

David Robbins
Updated and edited, please see changes. The 20 rows do not remain static and can change every couple of minutes/hours.
Scott
+2  A: 

Technically, I believe what you want to do is possible, but I wouldn't recommend it. There are several factors you have to consider before going down this path.

  1. Do you have the hardware to support it? If you don't have the memory for such a configuration and you have to page swap, then you'll probably lose most of the speed benefit of having it cached in memory. If you are using an out-of-process state server, then the system also has the overhead of serialization.

  2. How do you plan on searching through that many rows? A database server handles a lot of searching and sorting for you behind the scenes, using some pretty complex algorithms that you're going to lose if you cache the data on the web server.

There is no real hard and fast rule as to when something is faster in the database as opposed to in memory. It really depends on how the application is set up and how the data is stored.
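To illustrate point 2 above (a scaled-down Python sketch; the row count and field names are invented, and a real in-memory cache would need the same indexing work): once the data lives on the web server, every lookup is either a full scan or an index you must build and maintain yourself, work the database was doing for you.

```python
from collections import defaultdict

# Scaled-down stand-in for the cached table (real case: 200K-600K rows).
rows = [{"id": i, "category": i % 50, "value": i * 2} for i in range(100_000)]

# Naive cached-in-memory lookup: a full scan on every request.
def scan(category):
    return [r for r in rows if r["category"] == category]

# To avoid scanning, you rebuild the index the DB gave you for free,
# and you must keep it in sync every time a row changes.
index = defaultdict(list)
for r in rows:
    index[r["category"]].append(r)

def indexed(category):
    return index[category]
```

Both return the same rows; the difference is who maintains the index, which is exactly the hidden cost of moving the data out of the database.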

Kevin