views:

269

answers:

5

Hi,

I am writing a script and trying to measure its efficiency. I have a couple of questions:

  1. For small applications, is this kind of profiling required? Or am I getting paranoid? (assuming most code is decently efficient/no infinite loops)
  2. Against what should I benchmark this? What should I compare against?
  3. Below is the output I got from ab. Is this way off? Am I heading in the wrong direction designing this app? Are there any warning signals I should be aware of?
ab -n 10000 -c 100 http://localhost/testapp

This is ApacheBench, Version 2.3 
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient)
Completed 1000 requests
Completed 2000 requests
Completed 3000 requests
Completed 4000 requests
Completed 5000 requests
Completed 6000 requests
Completed 7000 requests
Completed 8000 requests
Completed 9000 requests
Completed 10000 requests
Finished 10000 requests


Server Software:        Apache/2.2.10
Server Hostname:        localhost
Server Port:            80

Document Path:          /testapp
Document Length:        525 bytes

Concurrency Level:      100
Time taken for tests:   33.608 seconds
Complete requests:      10000
Failed requests:        5179
   (Connect: 0, Receive: 0, Length: 5179, Exceptions: 0)
Write errors:           0
Total transferred:      6973890 bytes
HTML transferred:       5253890 bytes
Requests per second:    297.55 [#/sec] (mean)
Time per request:       336.080 [ms] (mean)
Time per request:       3.361 [ms] (mean, across all concurrent requests)
Transfer rate:          202.64 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   1.5      0     109
Processing:     8  334 403.9    176    3556
Waiting:        7  334 403.9    176    3556
Total:          9  335 403.8    177    3556

Percentage of the requests served within a certain time (ms)
  50%    177
  66%    296
  75%    415
  80%    519
  90%    842
  95%   1141
  98%   1615
  99%   1966
 100%   3556 (longest request)

I am using PHP to write the script. On further testing, I also found that "Failed requests" drops to 0 if I comment out the MySQL connection part of my PHP script. What's wrong? How do I reduce this failure rate?

Thank you, Alec

+1  A: 

I think this looks like a great job. Way off? I'd say it's way above the norm.

One question is what the load on the server will be in production. Your script is firing requests at this server, but if you're the only user on a development instance you aren't seeing what happens when you hit the production server while it's processing the typical production load.

If that's the case, you need TWO sources of request: one that represents your new app, and another for the production processes that it'll compete with for resources.

Can you set the # of simultaneous users in the benchmark software? Does this test just send off 1000 requests, one after the other? Having multiple users banging on the server at the same time might be more realistic.

Can you make the sending interval random? That might be a better representation of your real situation.

Can you vary the data that the script uses? Does it represent the conditions under which it'll be used very well?

Other than that, all I can offer is my congratulations. Looks like you're being very thorough to me.

duffymo
A: 

~200 ms per request is a common rule of thumb for a page to feel 'fast' to the majority of users.

Karsten
A: 

You shouldn't be getting any failed requests - you need to check your error log to see why they're failing.

It's most likely to be MySQL running out of connections, in which case you can simply tune your server to allow more concurrent connections (if you expect that amount of traffic).
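To see the underlying error rather than a silent length mismatch in ab, you can log the connection failure explicitly. A minimal sketch, assuming the mysqli extension and hypothetical credentials (replace with your own):

```php
<?php
// Hypothetical credentials -- substitute your real ones.
$db = mysqli_connect('localhost', 'user', 'pass', 'testdb');
if (!$db) {
    // Under heavy concurrency this is often "Too many connections".
    error_log('MySQL connect failed: ' . mysqli_connect_error());
    header('HTTP/1.1 503 Service Unavailable');
    exit;
}
```

If the log confirms connection exhaustion, raising `max_connections` in the `[mysqld]` section of my.cnf (and restarting MySQL) is the usual fix, provided the server has the memory for it.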

Greg
+1  A: 

Try using Xdebug to profile your code. Xdebug will also give you better on-screen errors and stack traces.

Then use webgrind to view the profile in a nice format.
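Enabling the profiler is a couple of php.ini settings. A minimal fragment, assuming Xdebug 2 (the extension path and output directory are just examples):

```ini
; Load the extension, then turn the profiler on.
zend_extension = /usr/lib/php/modules/xdebug.so
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp/xdebug
```

Each request then writes a cachegrind file into that directory, which webgrind (or KCachegrind) can open.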

lo_fye
+2  A: 

Do you expect 100 concurrent requests? Do you expect to receive 10K requests in 30 seconds?

It's great that you can run this benchmark, but ask yourself what it means. Think about the real amount of traffic you'll be receiving. You really need a question to benchmark against:

  • I expect that my site will have 3,000 users.
  • I expect that during peak usage, 500 of them will be hitting the page.
  • A typical usage is 3 requests over a minute: 3 × 500 / 60 ≈ 25 req/sec.
  • Can my site handle 25 req/sec and stay responsive (<200 ms per request)?
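The arithmetic in the last two bullets as a quick sanity check (3 requests per user per minute and 500 peak users are the example assumptions above):

```shell
# peak users * requests per user, spread over a 60-second window
echo $(( 3 * 500 / 60 ))    # 25 req/sec to aim for
```

You can then benchmark at roughly that level instead of 100 concurrent clients, e.g. `ab -n 1500 -c 25 http://localhost/testapp`.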

Unless you're in the top few percent of the web, your page won't see 100 concurrent requests in real life, and it doesn't make sense to tune your site for that level of traffic. To hit those numbers you'd need design compromises at the architecture level (database usage, caching methods, etc.); that is likely why failures appear once the database connection is in the script.

If you're only trying to profile your script, use Xdebug to find where your code is spending its time.

Gary Richardson