views: 201

answers: 5

I'm in the process of developing my first major project. It's a lightweight content management system.

I developed my own framework for the project. I'm sure that will attract many flames, and a few 'tut-tut's, but it seems to be doing quite well so far.

I'm seeing page generation times of anywhere from 5 to 15 milliseconds (for example, 0.00997686386108 seconds, just in case my numbers are wrong).
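
For reference, a microtime()-based harness along these lines is the usual way to capture such a number (a minimal sketch, not necessarily exactly how my framework measures it; variable names are illustrative):

    <?php
    // Record a high-resolution timestamp at the top of the entry script.
    $pageStart = microtime(true);

    // ... framework bootstrap, routing, queries, template rendering ...

    // At the end of the request, report elapsed wall-clock seconds.
    $elapsed = microtime(true) - $pageStart;
    echo "<!-- generated in {$elapsed} seconds -->";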

I want to make sure that the application is as efficient as possible. While it looks good in my testing environment, I want to be sure that it will perform as well as possible in the real world.

Should I be concerned about these numbers - and thus take the time to fine-tune MySQL and my interaction with it?

Edit: Additionally, are there any tools or methods people can recommend for saturating a system and reporting the results?

Additional Info: My 'testing' system is a spare web hosting account that I have over at BlueHost. Thus, I would imagine that any performance I see (positive or negative) would be roughly indicative of what I would see in the 'real world'.

+3  A: 

Performing well in your testing environment is a good start, but there are other issues you'll need to think about as well (if you haven't already). Here are a couple off the top of my head:

  1. How does your app perform as data sizes increase? A test environment usually has very little data. With lots of data, things like poorly optimized queries and missing indexes start to cause issues where they didn't before. If things are not designed well, performance can degrade sharply as the data grows (see the seeding sketch after this list).

  2. How does your app perform under load? Sometimes apps perform great with one or two users, but resource contention or concurrency issues start to pop up when lots of users get involved.
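
For point 1, one way to get closer to real-world data sizes is to seed the test database with a pile of dummy rows before benchmarking. A rough sketch, assuming PDO and a hypothetical pages table (adjust names and credentials to your schema):

    <?php
    // Seed a test table with enough rows to expose slow queries and
    // missing indexes. Table, column, and credential values are
    // hypothetical placeholders.
    $pdo = new PDO('mysql:host=localhost;dbname=cms_test', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO pages (title, body) VALUES (:title, :body)');

    for ($i = 0; $i < 100000; $i++) {
        $stmt->execute([
            ':title' => "Test page $i",
            ':body'  => str_repeat('Lorem ipsum dolor sit amet. ', 50),
        ]);
    }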

Eric Petroelje
Good answer! One thing that also plays a major role is the underlying hardware/OS and the server's location. An idle machine at home accessed via localhost is nothing like a shared service hosted somewhere remote...
merkuro
+2  A: 

You're doing very well at 5-15 ms. You're not going to know how it performs under load by any method other than throwing load at it, though.

chaos
+1  A: 

As mentioned in another question: what I often see missed is that most websites could increase their speed enormously by optimizing their frontend, not their backend. Have a look at this superb list from Yahoo about speeding up your frontend (two of the header-related items are sketched in code after the list):

  • Minimize HTTP Requests
  • Use a Content Delivery Network
  • Add an Expires or a Cache-Control Header
  • Gzip Components
  • Put Stylesheets at the Top
  • Put Scripts at the Bottom
  • Avoid CSS Expressions
  • Make JavaScript and CSS External
  • Reduce DNS Lookups
  • Minify JavaScript and CSS
  • Avoid Redirects
  • Remove Duplicate Scripts
  • Configure ETags
  • Make Ajax Cacheable
  • Flush the Buffer Early
  • Use GET for AJAX Requests
  • Post-load Components
  • Preload Components
  • Reduce the Number of DOM Elements
  • Split Components Across Domains
  • Minimize the Number of iframes
  • No 404s
  • Reduce Cookie Size
  • Use Cookie-free Domains for Components
  • Minimize DOM Access
  • Develop Smart Event Handlers
  • Choose <link> over @import
  • Avoid Filters
  • Optimize Images
  • Optimize CSS Sprites
  • Don't Scale Images in HTML
  • Make favicon.ico Small and Cacheable
  • Keep Components under 25K
  • Pack Components into a Multipart Document
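
To illustrate two of the items above ("Gzip Components" and "Add an Expires or a Cache-Control Header"), here is a minimal PHP sketch; the one-week lifetime is an arbitrary example, and in practice these are often set in the webserver config instead:

    <?php
    // "Gzip Components": compress the response body when the client
    // supports it; ob_gzhandler checks the Accept-Encoding header itself.
    ob_start('ob_gzhandler');

    // "Add an Expires or a Cache-Control Header": allow caching of this
    // response for one week (604800 seconds).
    header('Cache-Control: public, max-age=604800');
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 604800) . ' GMT');

    echo '<html><body>...page content...</body></html>';
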
Philippe Gerber
Yahoo's advice should be taken with a grain of salt - http://www.codinghorror.com/blog/archives/000932.html
Sean McSomething
A: 

5-15 milliseconds is a perfectly acceptable page generation time. But what matters most is how well your system performs with many people accessing your content at the same time, so you need to test it under heavy load and see how well it scales.

About tuning: setting up a clever cache policy is often more effective than tuning MySQL, especially when your database and your HTTP server are on different machines (a minimal sketch of the idea follows). There are very good Qs and As about caching on StackOverflow, if you need advice on that topic (I like that one, maybe because I wrote it :)
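
In its simplest form, that can be a file-based whole-page cache sitting in front of the expensive work. In the sketch below, the paths, the 60-second lifetime, and render_page() are all illustrative:

    <?php
    // Serve a stored copy of the page if it is still fresh; otherwise
    // generate it, store it, and send it.
    $cacheFile = '/tmp/cache_' . md5($_SERVER['REQUEST_URI']);
    $ttl = 60; // seconds a cached copy stays valid

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        readfile($cacheFile); // cache hit: no database work at all
        exit;
    }

    ob_start();
    echo render_page();        // hypothetical: runs queries, renders HTML
    $html = ob_get_clean();

    file_put_contents($cacheFile, $html);
    echo $html;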

Nicolas
A: 

It depends on a few factors. The most important is how much traffic you're expecting the site to get.

If your site is going to be fairly low-traffic (maybe 1,000,000 page views per day, which averages around 11 per second), it should be fine. You'll want to test this: use an HTTP benchmarking tool to run lots of requests in parallel and see what kind of results you get (a rough do-it-yourself sketch follows).
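
For example, PHP's curl_multi API can fire parallel requests. Dedicated tools like ab (ApacheBench), siege, or JMeter do this far better; this only shows the shape of the test, and the URL and counts are placeholders:

    <?php
    // Fire $parallel simultaneous GET requests at one URL and report
    // the total wall-clock time.
    $url = 'http://example.com/some-page';
    $parallel = 10;

    $multi = curl_multi_init();
    $handles = [];
    for ($i = 0; $i < $parallel; $i++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($multi, $ch);
        $handles[] = $ch;
    }

    $start = microtime(true);
    do {
        curl_multi_exec($multi, $running); // drive all transfers
        curl_multi_select($multi);         // wait for activity
    } while ($running > 0);
    $elapsed = microtime(true) - $start;

    foreach ($handles as $ch) {
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
    }
    curl_multi_close($multi);

    printf("%d parallel requests finished in %.3f seconds\n", $parallel, $elapsed);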

Remember that the more parallel requests you're handling, the longer each request will take. The important numbers are how many parallel requests you can handle before the average time becomes unacceptable, and the rate at which you can handle requests.

Taking that 1,000,000 views per day example - you want to be able to handle far more than 11 requests per second; likely at least 20, and at least 10 parallel requests.

You also want to test this with a representative dataset. There's no point benchmarking a CMS with one page if you're expecting to have 100. Take your best estimate, double it, and test with a dataset at least that large.

As long as you're not doing something stupid in your code, the single biggest improvement you can make is caching. If you make sure to set appropriate caching headers, you can stick a reverse proxy (such as Squid) in front of your webserver. Squid will serve anything that's in its cache directly, leaving your PHP application to handle only unique or updated page requests (a header sketch follows).
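
For that to work, the application has to mark its responses as cacheable by a shared cache. A minimal sketch, with an arbitrary five-minute lifetime (note that sending session cookies will typically prevent a shared cache from storing the page):

    <?php
    // Mark this response as storable by shared caches (e.g. Squid) for
    // five minutes. "public" allows intermediaries, not just the
    // browser, to keep a copy.
    header('Cache-Control: public, max-age=300');

    // A Last-Modified header lets the proxy revalidate cheaply with
    // If-Modified-Since requests. $lastEdit is a hypothetical timestamp.
    $lastEdit = time() - 3600;
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastEdit) . ' GMT');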

BlackAura