views:

220

answers:

3

I'm planning to deploy a Django-powered site, but I'm confused about the choice of web servers: Apache, lighttpd, nginx, and others.

I've read some articles about the performance of each of these choices, but no one seems to agree. So I'm wondering: why not test the performance myself?

I can't find information on the best approach to performance-testing web servers, so my questions are:

  1. Is there an easy way to test performance without a production site?
  2. Is there a method to simulate heavy traffic so the test is fair?
  3. How can I keep my test fair and close to the production situation?

After the test, I want to figure out:

  1. Why some people say nginx performs better when serving static files.
  2. The CPU and memory needs of each web server.
  3. Which one is my best choice.
+3  A: 

Tools like ab are commonly used to test how much load you can take from a battering of requests at once; alongside cacti/munin/your system-monitoring tool of choice, you can generate data on system load and requests/sec. The problem is that many people benchmarking don't realise they need to make a lot of *different* requests, because different parts of your code take varying amounts of time to execute. Profiling and benchmarking the code itself, not just the requests, is also important; plenty of folk have already done so for Django, and benchrun is not a bad tool either.
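A hand-rolled version of what ab measures can be sketched in a few lines of Python. The throwaway local `http.server` below is just a stand-in target so the sketch is self-contained; in a real test you would point `run_benchmark` at the Apache/nginx/lighttpd instance you're comparing:

```python
# Sketch of ab-style benchmarking: time a burst of concurrent requests
# and express the result as requests/sec. The local http.server here is
# only a placeholder target, not part of any real deployment.
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def run_benchmark(url, total_requests=200, concurrency=10):
    def fetch(_):
        with urllib.request.urlopen(url) as resp:
            return resp.status

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        statuses = list(pool.map(fetch, range(total_requests)))
    elapsed = time.perf_counter() - start
    return statuses, total_requests / elapsed  # (per-request statuses, req/s)

# Throwaway local server to benchmark against (port 0 = pick a free port).
server = http.server.ThreadingHTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

statuses, rps = run_benchmark(f"http://127.0.0.1:{server.server_address[1]}/")
print(f"{len(statuses)} requests, {rps:.0f} req/s, "
      f"all OK: {all(s == 200 for s in statuses)}")
server.shutdown()
```

Note it hits a single URL, which is exactly the trap described above; a fairer run would cycle through a mix of URLs exercising different code paths.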

The other issue is how many HTTP requests each page view takes. The fewer the requests, and the quicker they can be processed, the better a site can sustain a high amount of traffic: the quicker you finish and close connections, the quicker you free up resources for new ones.

In terms of the general speed of web servers, it goes without saying that a reverse proxy serving static content at your end will always outperform a full web server serving the same files. As for Apache vs nginx in regards to your Django app, it seems that mod_python is indeed faster than nginx/lighty + FastCGI, but that's no surprise because CGI, regardless of any speed-ups, is still slow. Executing and caching code inside the web server and letting it manage things is always faster (mod_perl vs plain CGI, mod_php vs CGI, etc.) if you do it right.

squeeks
The problem with nginx/lighty + wsgi runs a little deeper than you might think: http://blog.dscpl.com.au/2009/05/blocking-requests-and-nginx-version-of.html
Alex Barrett
+1  A: 

You need to set up the web server + website of your choice on a machine somewhere, preferably a physical machine with similar hardware specs to the one you will eventually be deploying to.

You then need to use a load testing framework, for example The Grinder (free), to simulate many users using your site at the same time.

The load testing framework should be on separate machine(s) and you should monitor the network and CPU usage of those machines as well to make sure that the limiting factor of your testing is in fact the web server and not your load injectors.

Other than that, it's just a matter of altering the content and monitoring response times, throughput, memory and CPU use, etc., to see how they change depending on which web server you use and what sort of content you are hosting.
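As a concrete sketch of what "monitoring response times and throughput" boils down to, here is a hypothetical `summarise` helper that reduces raw per-request timings to the figures you'd compare across web servers. The timings below are simulated for illustration; in a real run they come from your load-injector machines:

```python
# Reduce raw per-request timings to the numbers worth comparing across
# servers: throughput plus latency percentiles (the tail matters more
# than the average under load).
import statistics

def summarise(timings_ms, duration_s):
    q = statistics.quantiles(timings_ms, n=100)  # 99 percentile cut points
    return {
        "requests": len(timings_ms),
        "throughput_rps": len(timings_ms) / duration_s,
        "p50_ms": q[49],
        "p95_ms": q[94],
        "max_ms": max(timings_ms),
    }

# Simulated timings: mostly fast responses with an occasional slow one.
timings = [12, 14, 15, 13, 40, 12, 16, 90, 14, 13] * 20
summary = summarise(timings, duration_s=5.0)
print(summary)
```

Comparing the p95/max columns between two server configurations usually tells you more than the median, since a loaded server degrades at the tail first.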

Kragen
+1  A: 

Apache JMeter is an excellent tool for stress-testing web applications. It can be used with any web server, not just Apache.

Travis Beale