I need to test a web form that takes a file upload. Each uploaded file will be about 10 MB. I want to test whether the server can handle over 100 simultaneous uploads and still remain responsive for the rest of the site.

Repeated form submissions from our office will be limited by our local DSL line. The server is offsite with higher bandwidth.

Answers based on experience would be great, but any suggestions are welcome.

A: 

I would perhaps guide you towards using cURL: submit random data (for example, read 10 MB out of /dev/urandom and encode it as base32) in a POST request whose body you fabricate manually to look like a file upload (it's not rocket science).

Fork that script 100 times, perhaps spread over a few servers. Just make sure the sysadmins don't think you are running a DDoS or something :)
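A minimal sketch of that idea in Python, shelling out to curl (the URL and the form field name "file" are placeholders to adapt to your form; curl's -F option builds the multipart file-upload body for you):

import base64
import os
import subprocess
from multiprocessing import Pool

URL = "http://example.com/upload"  # hypothetical endpoint

def make_payload(path, size=10 * 1024 * 1024):
    # 10 MB of random bytes, base32-encoded as suggested above
    with open(path, "wb") as f:
        f.write(base64.b32encode(os.urandom(size)))

def upload(i):
    path = "/tmp/payload_%d.dat" % i
    make_payload(path)
    # -F posts the file as a multipart/form-data upload;
    # "file" is an assumed field name -- match it to your form
    subprocess.call(["curl", "-s", "-o", "/dev/null",
                     "-F", "file=@%s" % path, URL])

if __name__ == "__main__":
    with Pool(100) as pool:  # the "fork it 100 times" part
        pool.map(upload, range(100))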

Unfortunately, this answer remains a bit vague, but hopefully it helps by nudging you onto the right track.

Continued, as per Liam's comment:
If the server receiving the uploads is not on the same LAN as the clients connecting to it, it would be better to use test nodes that are as remote as possible, if only to simulate behavior as authentically as possible. But if you don't have access to computers outside the local LAN, the local LAN is still better than nothing.

Stress testing from the same hardware would not be a good idea, as you would put a double load on the server: generating the random data, packing it, and sending it through the TCP/IP stack (although probably not over Ethernet), and only then can the server do its magic. If the sending part is outsourced, you get double (taken with an arbitrarily sized grain of salt) the performance on the receiving end.

Henrik Paul
I guess you would do this on the server itself or some machine on the same LAN?
Liam
A: 

Automate Selenium RC using your favorite language. Start 100 threads of Selenium, each typing the path of the file into the input and clicking submit.

You could generate 100 sequentially named files to make looping over them easy, or just use the same file over and over again.
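A rough sketch of that in Python, using the legacy Selenium RC client (the locators "id=file_input" and "id=submit", the page URL, and the file paths are all assumptions to adapt to your form):

import threading
from selenium import selenium  # the old Selenium RC client, not WebDriver

def run_upload(n):
    # each thread drives its own browser session through the RC server
    s = selenium("localhost", 4444, "*firefox", "http://example.com/")
    s.start()
    s.open("/upload")
    s.type("id=file_input", "/tmp/upload_%03d.dat" % n)  # sequentially named files
    s.click("id=submit")
    s.wait_for_page_to_load("30000")
    s.stop()

threads = [threading.Thread(target=run_upload, args=(i,)) for i in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()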

Midhat
+1  A: 

Use the ab (ApacheBench) command-line tool that is bundled with Apache (I have just discovered this great little tool). Unlike cURL or wget, ApacheBench was designed for stress-testing web servers (of any type!), and it generates plenty of statistics too. The following command will send an HTTP POST request with the file test.jpg to http://localhost/ 100 times, with up to 4 concurrent requests.

ab -n 100 -c 4 -p test.jpg http://localhost/

It produces output like this:

Server Software:        
Server Hostname:        localhost
Server Port:            80

Document Path:          /
Document Length:        0 bytes

Concurrency Level:      4
Time taken for tests:   0.78125 seconds
Complete requests:      100
Failed requests:        0
Write errors:           0
Non-2xx responses:      100
Total transferred:      2600 bytes
HTML transferred:       0 bytes
Requests per second:    1280.00 [#/sec] (mean)
Time per request:       3.125 [ms] (mean)
Time per request:       0.781 [ms] (mean, across all concurrent requests)
Transfer rate:          25.60 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   2.6      0      15
Processing:     0    2   5.5      0      15
Waiting:        0    1   4.8      0      15
Total:          0    2   6.0      0      15

Percentage of the requests served within a certain time (ms)
  50%      0
  66%      0
  75%      0
  80%      0
  90%     15
  95%     15
  98%     15
  99%     15
 100%     15 (longest request)
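One caveat: ab's -p option posts the file as the raw request body, whereas a browser submits the file inside a multipart/form-data body. If your form handler expects the latter, you can pre-build the body yourself and set the matching Content-Type with -T. A hedged sketch in Python (the field name "file" and the boundary string are arbitrary choices):

BOUNDARY = b"----abstress1234"

with open("test.jpg", "rb") as f:
    data = f.read()

# wrap the file in a minimal multipart/form-data envelope
body = (b"--" + BOUNDARY + b"\r\n"
        b'Content-Disposition: form-data; name="file"; filename="test.jpg"\r\n'
        b"Content-Type: application/octet-stream\r\n\r\n"
        + data +
        b"\r\n--" + BOUNDARY + b"--\r\n")

with open("post_body.bin", "wb") as f:
    f.write(body)

# then run:
# ab -n 100 -c 4 -p post_body.bin -T "multipart/form-data; boundary=----abstress1234" http://localhost/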
Liam
Very cool, thanks for sharing.
John McCollum
A: 

Check out tools like OpenSTA, JMeter, or Pylot.

Corey Goldberg