Hi folks,

I'm using CherryPy to serve a Python application through WSGI.

I tried benchmarking it, but it seems CherryPy can only handle exactly 10 req/sec, no matter what I do.

I built a simple app with a 3-second pause to pin down what is going on, and I can confirm that the 10 req/sec cap has nothing to do with the resources used by the Python script.
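
Roughly, the test handler just sleeps for 3 seconds (a minimal sketch, assuming CherryPy 3.x; the real app waits on data collected from elsewhere):

import time
import cherrypy

class SlowApp:
    @cherrypy.expose
    def index(self):
        time.sleep(3)  # stand-in for waiting on an external source
        return "done"

if __name__ == '__main__':
    cherrypy.quickstart(SlowApp())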


Any ideas?

+3  A: 

By default, CherryPy's built-in HTTP server uses a thread pool with 10 threads. If you are still using the defaults, you could try increasing this in your config file.

[global]
server.thread_pool = 30
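
Or, equivalently, in code rather than the config file (a sketch, assuming CherryPy 3.x):

import cherrypy

# Same setting applied to the global config programmatically
cherrypy.config.update({'server.thread_pool': 30})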

http://www.cherrypy.org/wiki/ServerAPI

diatoid
@diatoid: thank you very much! Btw, I thought CherryPy was built to support the highest number of req/sec possible.
RadiantHex
CherryPy tries to set sane defaults, which produce around 1200 req/sec max on my laptop. But then, those benchmark requests don't take 3 seconds each. The reality for your site should be somewhere in the middle; if your real requests take 3 seconds each you're probably doing something wrong ;)
fumanchu
Requests take 3 seconds because they are waiting for information to be collected from elsewhere. While waiting I am not using any resources whatsoever! So why do I have to leave my machine idle when I could be serving more requests at the same time?
RadiantHex
You are using resources; for any webapp, you're keeping the socket open, which uses a file descriptor and an ephemeral port. In CherryPy, each child connection is bound to a thread for its lifetime, so you're also using one of the worker threads (and its 1MB stack size and any heap objects needed to handle each request). So you have a choice: either 1) increase the number of threads, 2) redesign your app to return 202 Accepted and poll for the answer (which would free up the socket, too), or 3) use an async webserver (and then fight latency issues instead of throughput ones).
fumanchu
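
For reference, option 2 (return 202 Accepted and poll) might look roughly like this; the in-memory job store and the fetch_slow_data worker are hypothetical placeholders for the real slow lookup:

import threading
import uuid
import cherrypy

jobs = {}  # illustrative in-memory job store: job_id -> result or None

def fetch_slow_data(job_id):
    # The 3-second external lookup would go here.
    jobs[job_id] = "result"

class AsyncJobs:
    @cherrypy.expose
    def submit(self):
        # Kick off the slow work in the background and return immediately.
        job_id = str(uuid.uuid4())
        jobs[job_id] = None
        threading.Thread(target=fetch_slow_data, args=(job_id,)).start()
        cherrypy.response.status = 202  # Accepted: poll later for the result
        return job_id

    @cherrypy.expose
    def poll(self, job_id):
        # Return the result if ready, otherwise tell the client to retry.
        result = jobs.get(job_id)
        if result is None:
            cherrypy.response.status = 202
            return "pending"
        return result

if __name__ == '__main__':
    cherrypy.quickstart(AsyncJobs())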