views: 557
answers: 5

In order to test my program, I want to send about 50 HTTP requests as fast as possible. Currently I'm using a bash script that calls curl with & to send each request in the background, like

curl "http://my.host.com/test.php?param=1" &
curl "http://my.host.com/test.php?param=2" &
curl "http://my.host.com/test.php?param=100" &

but it doesn't seem fast enough. I guess each curl call establishes a new TCP connection first, so performance could be improved if the connection were established only once. Are there any other tools to send requests efficiently?

+1  A: 

Don't use bash. Use any other scripting language (Perl, Python, whatever), then create a single connection and pound away on the requests as quickly as possible. For bonus points, launch multiple threads so that you test your program's ability to handle simultaneous connections!
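The single-connection part of this advice can also be tried with curl itself before switching languages: when one curl invocation is given several URLs, it reuses the TCP connection between them. A minimal sketch using the asker's example URLs:

```shell
# One curl process, one TCP connection reused across both requests
# (curl keeps the connection alive between URLs in the same invocation).
curl -s "http://my.host.com/test.php?param=1" \
     "http://my.host.com/test.php?param=2"
```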

JSBangs
+1  A: 

JMeter is a good way to load-test Apache.

From their site:

Apache JMeter is open source software, a 100% pure Java desktop application designed to load test functional behavior and measure performance. It was originally designed for testing Web Applications but has since expanded to other test functions.

What can I do with it?

Apache JMeter may be used to test performance both on static and dynamic resources (files, Servlets, Perl scripts, Java Objects, Data Bases and Queries, FTP Servers and more). It can be used to simulate a heavy load on a server, network or object to test its strength or to analyze overall performance under different load types. You can use it to make a graphical analysis of performance or to test your server/script/object behavior under heavy concurrent load.

gnarf
+5  A: 

Apache comes with a benchmarking program called "ab". There are also many stress-test applications for websites on the net that you can use.
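For the 50-request case from the question, an ab invocation might look like this (URL taken from the question; -k enables HTTP keep-alive so connections are reused):

```shell
# 50 requests total, 10 at a time, with keep-alive connection reuse.
ab -n 50 -c 10 -k "http://my.host.com/test.php?param=1"
```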

Daniil
http://httpd.apache.org/docs/2.0/programs/ab.html
gnarf
+1  A: 

The reason this doesn't perform well, even though you background all the downloads, is that the OS still has to load curl (from the cache, but still), create a process for it, parse the parameters, and so on.

You need something that can send N requests in parallel without starting new processes.
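If a reasonably recent curl is available (7.66 or newer), a single curl process can do exactly this: the URL range syntax expands to many requests, and parallel transfers are built in. A sketch using the host from the question:

```shell
# One curl process expands [1-100] into 100 requests and runs up to 50
# transfers concurrently, reusing connections where possible -- no new
# process per request. Requires curl >= 7.66 for --parallel.
curl --silent --parallel --parallel-max 50 \
  "http://my.host.com/test.php?param=[1-100]" > /dev/null
```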

Aaron Digulla
+2  A: 

In PHP (via its cURL extension, of course) there is something really nice.

It takes about 16 seconds to do 1,000 regular GETs on google.com, which is about 63 requests per second.

Alix Axel