First of all, I'm not in any way unhappy with the performance of my Django-powered site; it's not getting massive traffic, a bit over 1000 visits per day so far.
I was curious how well it would cope with heavy traffic peaks, so I used ab (ApacheBench) to do some benchmarking.
I noticed that with a concurrency greater than 1 it delivers the same number of requests per second as with a single concurrent connection.
Shouldn't the reqs/s increase with concurrency?
I'm on a virtual machine with 1 GB of RAM, running apache2 (prefork), mod_wsgi, memcached and MySQL.
All content on the page is cached, so the database doesn't take any hits. And if memcached were to drop the entry, there are only 2 light (indexed) queries, and the result should immediately be re-cached.
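To illustrate what I mean, this is roughly the cache-aside pattern the page uses (names are hypothetical and a plain dict stands in for memcached, just to keep the sketch self-contained):

```python
# Hypothetical sketch of the caching described above; a dict stands in
# for memcached so the example runs on its own.
cache = {}

def get_page_fragment(key, query):
    """Return the cached value, falling back to the query and re-caching."""
    if key in cache:
        return cache[key]          # cache hit: no database work
    value = query()                # cache miss: the two light, indexed queries
    cache[key] = value             # immediately re-cached
    return value

# Stand-in for the real queries, counting how often it actually runs.
calls = []
def fake_query():
    calls.append(1)
    return "rendered fragment"

get_page_fragment("startpage", fake_query)  # miss: runs the query
get_page_fragment("startpage", fake_query)  # hit: served from the cache
```

So even after an eviction, only the first request pays the query cost.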
Benchmarking data (note: I benchmarked with 2000 and 10k requests with the same results):
For the start page, served by Django through apache2/mod_wsgi:
-n100 -c4: http://dpaste.com/97999/ (58.2 reqs/s)
-n100 -c1: http://dpaste.com/97998/ (57.7 reqs/s)
For robots.txt, served directly by apache2:
-n100 -c4: http://dpaste.com/97992/ (4917 reqs/s)
-n100 -c1: http://dpaste.com/97991/ (1412 reqs/s)
This is my apache conf: http://dpaste.com/97995/
Edit: Added more information
wsgi.conf: http://dpaste.com/98461/
mysite.conf: http://dpaste.com/98462/
My WSGI handler:

import os

# Tell Django which settings module to use before the handler is created.
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'

import django.core.handlers.wsgi

# "application" is the entry point mod_wsgi looks for.
application = django.core.handlers.wsgi.WSGIHandler()
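For what it's worth, I understand that with prefork and embedded mod_wsgi the number of requests Django can serve in parallel is tied to the number of Apache worker processes. A daemon-mode setup is what I'd compare against; a sketch like the following (process/thread counts and the script path are made up, not taken from my actual conf):

```apache
# Hypothetical daemon-mode config; values are illustrative only.
WSGIDaemonProcess myproject processes=2 threads=15
WSGIProcessGroup myproject
WSGIScriptAlias / /path/to/myproject/wsgi-handler.py
```

Is something like this what I'd need for the reqs/s to actually scale with concurrency, or is the bottleneck elsewhere?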