I think this question doesn't have much to do with programming per se, but nevertheless the answers might be interesting to other web developers.

I just wondered how to estimate the minimum requirements for a fast website. Obviously there are some factors that have to be considered, like the expected number of visitors, the derived number of clicks per second, and so on... Also, running services like web servers (Apache/lighttpd) or mail servers (Exim, sendmail, ...) can lead to different requirements.
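For instance, the kind of derivation I mean, from expected visitors down to peak requests per second, could be sketched like this (all the numbers are made-up assumptions, purely for illustration):

```python
# Rough, illustrative estimate: daily visitors -> peak requests per second.
# Every number below is an assumption; substitute your own measurements.
visitors_per_day = 10_000    # assumed daily traffic
pages_per_visit = 5          # assumed pages viewed per visitor
requests_per_page = 20       # HTML plus images, CSS, JS, ...
peak_factor = 3              # traffic is not spread evenly over the day

requests_per_day = visitors_per_day * pages_per_visit * requests_per_page
avg_rps = requests_per_day / 86_400   # 86,400 seconds per day
peak_rps = avg_rps * peak_factor

print(f"average: {avg_rps:.1f} req/s, peak: {peak_rps:.1f} req/s")
# -> average: 11.6 req/s, peak: 34.7 req/s
```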

Maybe you know a good website, or can explain how to estimate the needed server configuration from such information?

+3  A: 

You have already mentioned the number of users, servers, etc. Here are some more things to consider.

  1. Clustered servers if the traffic is high
  2. Physical location of the server: find the target audience for your website, and preferably host the server in that country.
  3. Disaster recovery plan. Having a fast website is good, and having a fast recovery process is also very good.
  4. Choose appropriate technology, use techniques like AJAX, and reduce server requests wherever possible.

Will add more if something comes up.

Shoban
+4  A: 

This is arguably more art than science.

What you have to remember is that, like many things in programming and IT, your website will be as slow as the slowest link in the chain: some bottleneck such as bandwidth, the Web servers, disk I/O, memory, your databases, your firewall, etc. will limit the speed of your website.

Tuning and growing your Web site will involve identifying those issues as you grow and addressing them. At one point you may need to add more RAM, at another you may need another CPU and so on. At other times adding more memory might be useless because memory isn't your problem.

Likewise, a shortage of one resource can masquerade as another: a lack of memory can show up as intensive disk I/O because your system swaps (page faults) constantly, even though disk I/O isn't the real problem.

So what do you do?

The first thing is you need to identify (or make a reasonable guess at) what a typical user will do and how much they will do it. Ideally you will be able to model 100 or 1000 or however many users you need with software like JMeter to get an idea of how your website scales, how much bandwidth is going to be required and so on. By modelling 100, 500, 1000, 2000 users you will hopefully be able to see how linearly your website scales.
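JMeter is the right tool for this, but just to illustrate the idea of simulating N concurrent users, here's a minimal sketch (it assumes you have some local test server running at a hypothetical `TEST_URL`; the commented-out loop at the end is how you'd ramp the user count up):

```python
# Crude load-generation sketch. JMeter is far more capable; this only
# shows the shape of the idea: N threads, each acting as one "user".
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TEST_URL = "http://localhost:8000/"   # assumed local test server

def one_user(n_requests: int) -> float:
    """Simulate one user making n_requests; return total seconds spent."""
    start = time.perf_counter()
    for _ in range(n_requests):
        urllib.request.urlopen(TEST_URL).read()
    return time.perf_counter() - start

def run_load_test(n_users: int, requests_per_user: int = 10) -> float:
    """Return average seconds per request across all simulated users."""
    with ThreadPoolExecutor(max_workers=n_users) as pool:
        times = list(pool.map(one_user, [requests_per_user] * n_users))
    return sum(times) / (n_users * requests_per_user)

# Ramp up the user count to see how response time scales:
# for n in (100, 500, 1000, 2000):
#     print(n, run_load_test(n))
```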

You may find that supporting 1000 users requires 1 gig of RAM but 2000 requires 4 gigs: that's an example of non-linear scalability that exposes a problem you will have scaling your website up. And that's the kind of thing to be revealed by performance testing.
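A quick sanity check on measurements like those (the numbers below are made up to match the example) might look like:

```python
# Made-up measurements: simulated users -> memory used (GB).
measurements = {100: 0.1, 500: 0.5, 1000: 1.0, 2000: 4.0}

# If scaling were linear, memory should grow roughly in proportion
# to the user count between consecutive measurements.
users = sorted(measurements)
for lo, hi in zip(users, users[1:]):
    ratio_users = hi / lo
    ratio_mem = measurements[hi] / measurements[lo]
    flag = "  <-- non-linear!" if ratio_mem > ratio_users * 1.5 else ""
    print(f"{lo} -> {hi} users: memory grew {ratio_mem:.1f}x "
          f"vs {ratio_users:.1f}x users{flag}")
```

With these numbers, the 1000 to 2000 step gets flagged: users doubled but memory quadrupled.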

Honestly though, hardware is so cheap these days that it's rarely a problem except for the biggest and most popular of sites ($10k can buy you 1 or even 2 servers with 16G of RAM and 4-8 cores each). Shared and VPS hosting are a different story because you'll typically only want to pay for however much memory, bandwidth and disk space you need. Luckily those kinds of solutions tend to allow you to upgrade pretty easily (at least to a point, after which you'll eventually have to go to dedicated hosting).

You can make some dirty estimates at the beginning of a project by doing what they call "back of the envelope" estimation. Run key queries, say, 100 times and work out how much CPU time they require, hit a mocked-up page 100 times and work out how much bandwidth it generates, and so on. These rough estimates, combined with guesses about how users will use the site, will give you a ballpark (hopefully within a factor of 2-3) of what you'll need.
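A back-of-the-envelope run like that could be sketched as follows; `run_key_query` is a hypothetical stand-in for your real query, and the page size and traffic figures are assumptions you'd replace with your own:

```python
import timeit

def run_key_query():
    # Hypothetical stand-in for a real database query; replace with yours.
    return sum(i * i for i in range(10_000))

# Average CPU time per query over 100 runs.
seconds_per_query = timeit.timeit(run_key_query, number=100) / 100

# Bandwidth per page view: mocked-up page plus its assets (assumed sizes).
page_bytes = 30_000 + 5 * 20_000      # HTML + 5 images of ~20 KB each

# Combine with traffic guesses for a ballpark (factor of 2-3) estimate.
queries_per_page = 3                  # assumed
peak_pages_per_second = 35            # from your traffic estimate

cpu_needed = peak_pages_per_second * queries_per_page * seconds_per_query
bandwidth_needed = peak_pages_per_second * page_bytes   # bytes/second

print(f"~{cpu_needed:.2f} CPU-seconds/second, "
      f"~{bandwidth_needed / 1_000_000:.1f} MB/s bandwidth")
```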

cletus