Hello, I'm writing a web site (C#, ASP.NET 3.5) that includes a simple CMS. In several limited places, I allow the site admin to manage the page content.

Editing the content is done through an Edit control, and the output HTML is stored in the database (SQL Server Express).

Each time a page loads, I read the page's HTML content from the database and render it.

For now (development stage), everything works smoothly.

But I'm a bit afraid of performance issues in the real world (too many DB calls?).

I'm looking for the optimal solution for caching the dynamic HTML pages. Assuming the page content will NOT be updated frequently, should I:

  • Keep the solution as it is today: call the DB for the dynamic page content on each page load (if !IsPostBack)?
  • Store the page content in a file and read it from disk?
  • Store the page content in Application variables?
  • Something else?
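To illustrate the third option: rather than raw Application variables, ASP.NET's Cache object would be the more idiomatic place for this, since it supports expiration. A minimal sketch of what I have in mind (LoadContentFromDb, pageId, and contentPlaceholder are hypothetical placeholders):

```csharp
// Sketch only: hits the DB once, then serves the HTML from cache for 60 minutes.
string cacheKey = "cms-content-" + pageId;
string html = HttpContext.Current.Cache[cacheKey] as string;
if (html == null)
{
    html = LoadContentFromDb(pageId);    // single DB call on a cache miss
    HttpContext.Current.Cache.Insert(
        cacheKey,
        html,
        null,                            // no cache dependency
        DateTime.UtcNow.AddMinutes(60),  // absolute expiration
        System.Web.Caching.Cache.NoSlidingExpiration);
}
contentPlaceholder.InnerHtml = html;
```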

I assume that the page content, i.e. the HTML text, is not too long.

Thanks

A: 
Michel van Engelen
A: 

Use the OutputCache directive on your pages.

OutputCache on MSDN

Super easy to do: cache the page for, say, 60 minutes, and there will be at most 24 DB calls per page in a given day.
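For example, a page-level directive along these lines (Duration is in seconds, so 3600 = 60 minutes; VaryByParam="none" caches a single copy regardless of query string):

```aspx
<%@ OutputCache Duration="3600" VaryByParam="none" %>
```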

Watch this: How to use Output Cache video

Slee
A: 

I don't want to post a negative response or fail to provide an answer, but before doing anything, or even considering caching, you should work out whether this is actually a performance issue. If you read anything by people who have been heavily involved in performance (Rico Mariani and his blog are a good example), you will see that one of the ways this is achieved is through measuring.

How many hits would you expect per hour or minute? How long does the DB call take on the production server?

There are a number of ways of finding this out. A simple code profiler (a custom .NET class; message me if you want details) can be used to time each request and write the result to a log file. The time taken to make the database call can be logged the same way. This gives you an idea of performance for a single request (assuming the staging hardware and environment are very similar to production).

To test it under load, use a stress-testing tool to simulate the expected traffic. That should tell you your average request-serving time under load. If that is acceptable, you might not need any caching at all. Caching is often adopted before it is necessary, and adding code to use the Enterprise Library caching features will increase the complexity of your code unnecessarily. Output caching can also introduce subtle issues, but in this situation I would agree with the answer that suggests it as a viable solution.
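As a sketch of the simple-profiler idea, System.Diagnostics.Stopwatch can time the database call and append the result to a log file (LoadContentFromDb and pageId are hypothetical placeholders for your own data-access call):

```csharp
using System.Diagnostics;

var sw = Stopwatch.StartNew();
string html = LoadContentFromDb(pageId);  // the call being measured
sw.Stop();

// Append a timestamped entry so timings can be analysed later.
System.IO.File.AppendAllText("perf.log",
    string.Format("{0:u}\tLoadContentFromDb\t{1} ms{2}",
        DateTime.UtcNow, sw.ElapsedMilliseconds, Environment.NewLine));
```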

From what you have described, it doesn't sound like you would be making too many database hits; you would be surprised at the number of hits per request some sites make. As for your second option, I would definitely not re-architect the application for performance until measurements have been taken. I assume you are storing the content in the database for good reasons; moving it onto disk carries its own overheads and security implications.

Ian Gibson
In such a case, where there would be only one or two database calls per page to retrieve the CMS text, I agree that implementing a caching mechanism is not necessary.
Ranch