I see a lot of benchmarks comparing PHP, Python, Ruby, etc. all over the Internet. Ruby has gotten a lot of flak for being super slow, which leads some developers to refuse to use it for web development for "performance reasons". But does the performance of the interpreter really matter for web applications? Isn't the bottleneck in the database 99% of the time anyway? So why is everyone freaking out?
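To make my intuition concrete, here's a rough, hypothetical sketch in Python (the 10 ms "query" is an assumption standing in for a typical database round trip, not a measurement): the time spent waiting on the database seems to dwarf the time spent in interpreter-bound application code for an ordinary page.

```python
import time

def fake_db_query():
    # Assumption: a single query round trip costs on the order of 10 ms,
    # simulated here with sleep() instead of a real database.
    time.sleep(0.010)
    return [{"id": i, "name": f"user{i}"} for i in range(50)]

def render_page(rows):
    # Stand-in for the interpreter-bound work: building the response in pure Python.
    return "".join(f"<li>{row['name']}</li>" for row in rows)

start = time.perf_counter()
rows = fake_db_query()
db_done = time.perf_counter()
html = render_page(rows)
end = time.perf_counter()

print(f"db wait:  {(db_done - start) * 1000:.2f} ms")
print(f"app code: {(end - db_done) * 1000:.2f} ms")
```

Under that assumption, making the "app code" part a few times faster barely changes the total request time, which is what makes me question the obsession with interpreter benchmarks.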
Note: I realize that in some edge cases, such as certain mathematical/scientific web applications, performance matters a lot, but I'm not talking about those; I'm talking about your average social networks, Stack Overflows, etc.