I developed a small, nifty web server in C and would like to evaluate its performance. For this I'm doing the following:

Measuring the socket establishment time, file transfer time (for files of random sizes), and socket teardown time in the following scenarios:

  • Single-threaded
  • Multi-threaded

This should give me the throughput/bandwidth. I'm planning to set this up on a bunch of computers and measure everything. For the client part, I'm using PHP and making use of simple timing functions in the following manner:

<?php
$time_start = microtime(true);
// COMMAND TO PROFILE
$time_end = microtime(true);
$time = $time_end - $time_start;
echo "Task took $time seconds\n";
?>

Are there any other metrics that I should measure that would give me some valuable insights?

+2  A: 

Hmm, I'm not sure that's the best approach to benchmarking request performance. Take a look at ab, which ships with the Apache distribution; it's a rudimentary tool, but you should be able to run it on the same server and get a more accurate benchmark for request time. It also reports a bunch of other metrics.

http://httpd.apache.org/docs/2.0/programs/ab.html
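For example, a typical invocation looks like this (the URL, request count, and concurrency below are placeholders; point it at your own server):

```shell
# 1000 requests total, 10 at a time, against a hypothetical local server
ab -n 1000 -c 10 http://127.0.0.1:8080/
```

Its report includes requests per second, transfer rate, and a breakdown of connect/processing/waiting times.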

Jon
I'm just curious: what would be the difference if I measured these metrics from the server side as opposed to the client side?
Legend
You cut out network latency from the request (assuming you're testing via localhost) and get a raw measure of your web server's performance. The network can really skew things. It depends on whether you want to take that into account, I guess.
Jon
+2  A: 

If you are profiling PHP performance, you can use Xdebug (among many others). But if you are concerned about web server performance, it's a different story. Web servers like Apache have profiling tools developed for them as well, e.g. the ab tool.

ghostdog74
Sorry, I am testing my own web server. I'll explore the Apache tools. Thanks!
Legend
+1  A: 

Memory usage might be a good one to measure. You might also want to look at how frequently certain functions are called, to see what to optimize, if anything.

Also, Facebook put out a tool called XHProf that might be worth a look: http://mirror.facebook.net/facebook/xhprof/doc.html. Some additional instructions on its usage are here: http://techportal.ibuildings.com/2009/12/01/profiling-with-xhprof/.

jcmoney
+1  A: 

Try using Xdebug with profiling enabled, then download and install Webgrind.

You:

  1. Avoid cluttering your code with timing calls, and
  2. Gain a lot more information: memory usage, how many times a routine is called, how expensive a routine is, and where it was called from.

I doubt you could collect such information as easily with a hand-rolled PHP profiling class.

There are plenty of tutorials to get you started.
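For reference, turning on the profiler in Xdebug 2 is a php.ini change along these lines (the output directory is an assumption; pick one writable by PHP):

```ini
; php.ini -- Xdebug 2-era profiler settings (illustrative)
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp
```

Webgrind then reads the cachegrind output files written to that directory.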


The Pixel Developer