Let's say I have a web application running on S servers with an average of C cores each, and that at any instant the application is processing an average of R requests. If R is around 10 times larger than S * C, won't the benefit of spreading the work of a single request across multiple cores be minimal, since each core already has around 10 requests to work through?
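To make the arithmetic concrete, here are some invented numbers (not from any real deployment):

```python
S = 4           # servers (made-up)
C = 8           # cores per server (made-up)
R = 10 * S * C  # in-flight requests, per my assumption above -> 320

print(R / (S * C))  # 10.0 requests already in flight per core
```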
If I'm correct, why does this guy say concurrency is so important to the future of Python as a language for web development?
I can see a few reasons why my argument might be incorrect. Perhaps the application receives a small number of very difficult requests, few enough that the available cores outnumber them, so each one could be parallelized. Or perhaps there is large variation in the difficulty of requests, so a single core can be unlucky and be handed 10 consecutive difficult ones, with the result that some of them take far longer than is reasonable. Given that the guy who wrote the essay above is far more experienced than I am, I think there's a significant chance I'm wrong about this, but I'd like to understand why.
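To test that last intuition, I sketched a toy queueing simulation. Every number in it is invented: 8 cores, one arrival every 2 ms fleet-wide, and 1% of requests costing 500 ms instead of 5 ms:

```python
import random

random.seed(0)

CORES = 8        # total cores across the fleet (made-up)
N = 50_000       # simulated requests
SPACING = 2.0    # ms between arrivals, fleet-wide (made-up)

def service_time():
    # 1% of requests are "difficult" (500 ms); the rest are cheap (5 ms).
    return 500.0 if random.random() < 0.01 else 5.0

# Round-robin assignment; each core drains its own backlog FIFO.
free_at = [0.0] * CORES          # when each core next becomes idle
latencies = []
for n in range(N):
    arrival = n * SPACING
    core = n % CORES
    start = max(arrival, free_at[core])
    free_at[core] = start + service_time()
    latencies.append(free_at[core] - arrival)

latencies.sort()
print(f"median: {latencies[N // 2]:7.1f} ms")
print(f"p99:    {latencies[int(N * 0.99)]:7.1f} ms")
```

By construction the p99 can't beat the difficult requests' own 500 ms, and any cheap request that lands behind one on the same core inherits part of that delay, even though average utilization is comfortably under 100%. If that's the effect the essay is getting at, then spreading a difficult request across several cores would shrink exactly that blocking window.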