I've been developing a website that uses Django and MySQL; what I want to know is how many HTTP requests my server can handle when serving certain pages.
I have been using siege, but I'm not sure whether that's a good benchmarking tool.
Try ab, the Apache HTTP server benchmarking tool. It has many options. An example of use with ten concurrent requests:
% ab -n 20 -c 10 http://www.bortzmeyer.org/
...
Benchmarking www.bortzmeyer.org (be patient).....done
Server Software: Apache/2.2.9
Server Hostname: www.bortzmeyer.org
Server Port: 80
Document Path: /
Document Length: 208025 bytes
Concurrency Level: 10
Time taken for tests: 9.535 seconds
Complete requests: 20
Failed requests: 0
Write errors: 0
Total transferred: 4557691 bytes
HTML transferred: 4551113 bytes
Requests per second: 2.10 [#/sec] (mean)
Time per request: 4767.540 [ms] (mean)
Time per request: 476.754 [ms] (mean, across all concurrent requests)
Transfer rate: 466.79 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 22 107 254.6 24 854
Processing: 996 3301 1837.9 3236 8139
Waiting: 23 25 1.3 25 27
Total: 1018 3408 1795.9 3269 8164
Percentage of the requests served within a certain time (ms)
50% 3269
66% 4219
...
(In that case, network latency was the main source of slowness.)
ab reports itself in the User-Agent field, so in the HTTP server's log you'll see something like:
2001:660:3003:8::4:69 - - [28/Jul/2009:12:22:45 +0200] "GET / HTTP/1.0" 200 208025 "-" "ApacheBench/2.3" www.bortzmeyer.org
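To run the same kind of test against your Django site, a minimal sketch (the host, port, and path below are hypothetical placeholders for your own setup):

% ab -n 100 -c 10 http://127.0.0.1:8000/some-page/

Don't benchmark Django's development server; point ab at your production stack (e.g. Apache with mod_wsgi) so the numbers reflect what you'll actually deploy.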
Grinder is pretty good. It lets you simulate coordinated load from several client machines, which is more meaningful than load generated from a single machine.
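Grinder 3 test scripts are written in Jython; here's a minimal sketch of one (the URL is a hypothetical placeholder, modeled on the basic HTTP example from the Grinder documentation):

# grinder_test.py -- minimal Grinder 3 test script (Jython)
from net.grinder.script.Grinder import grinder
from net.grinder.script import Test
from net.grinder.plugin.http import HTTPRequest

# Wrapping the HTTPRequest in a Test makes The Grinder record
# timing statistics for every call made through it.
test1 = Test(1, "Front page")
request = test1.wrap(HTTPRequest())

class TestRunner:
    # Each worker thread creates one TestRunner and calls it once per run.
    def __call__(self):
        result = request.GET("http://localhost:8000/")

You then point grinder.properties at the script and start one agent process per client machine.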
I've used httperf and it's quite easy to use. There's a PeepCode screencast on how to use it as well.
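A minimal httperf run might look like this (host, port, and URI are hypothetical placeholders); it opens 100 connections at a fixed rate of 10 new connections per second and reports reply rates and timings:

% httperf --server 127.0.0.1 --port 8000 --uri /some-page/ --num-conns 100 --rate 10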