Let's assume our average page weighs P KBytes and we get N visits per unit time (say, per hour). How can we estimate the number of servers needed to support this load with a reasonable response time?

+4  A: 

There are so many factors in server performance (CPU, RAM, I/O, cache, local or external database, networking, etc.) that you can't even guesstimate the number of servers needed for a given project.

I find anybody who tries to figure this out is going down the wrong path, because they often have some unrealistic idea that the site they are going to build is going to be insanely popular.

My advice: just build your app and use one server. Assume you might someday need a server farm and build your software accordingly (which in .NET pretty much means you don't do anything different). Only buy more servers when you have an actual need; buying or renting servers in anticipation will always lead to wasted money. You're not Amazon or Google; you can't accurately predict future expansion until you have current figures.

I think you'll find one good server is surprisingly fast and can handle a lot of users. Keep in mind that StackOverflow (last I saw) runs on only two servers: a web server and a database server. That's it.

TravisO
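For what it's worth, the only quantity the question's P and N pin down by themselves is outbound bandwidth, which is exactly why they aren't enough to size a farm. A minimal back-of-envelope sketch, where the per-server link speed and target utilization are assumptions, not measured figures:

```python
import math

def servers_for_bandwidth(page_kb, visits_per_hour,
                          server_mbit_per_s=100, utilization=0.5):
    """Servers needed if outbound bandwidth were the only limit.

    server_mbit_per_s and utilization are illustrative assumptions;
    real sizing depends on CPU, I/O, database, caching, etc.
    """
    # Average outbound traffic in megabits per second:
    # KB/page * 8 bits/byte * pages/hour / 1000 Kbit/Mbit / 3600 s/hour.
    mbit_per_s = page_kb * 8 * visits_per_hour / 1000 / 3600
    # Leave headroom: run each link at only `utilization` of capacity.
    usable = server_mbit_per_s * utilization
    return max(1, math.ceil(mbit_per_s / usable))

# Example: 500 KB pages at 100,000 visits/hour is about 111 Mbit/s,
# so 100 Mbit links run at 50% would need 3 servers -- by this one
# metric alone, ignoring everything the answer above lists.
print(servers_for_bandwidth(500, 100_000))
```

This is the whole point of the answer: the calculation is trivial, but it only bounds one of a dozen resources, so it tells you almost nothing about real capacity.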
With .NET, use StateServer or SQL Server session management. Otherwise you won't catch state serialization errors until you actually need to share session state across machines. You can go back to InProc for release. Also use a web garden on your test machine to simulate multiple servers.
Robert Wagner
+1 - there are too many factors involved to make any sort of educated estimate. Wick mentioned page size and visit frequency, but what does the server-side logic do? There are just too many things that factor in. You've got to build and then test.
Kon
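"Build and then test" can be done cheaply before renting anything. A minimal self-contained sketch using only the Python standard library: it spins up a throwaway local HTTP server, fires concurrent requests at it, and reports throughput. The handler, worker count, and request count are illustrative assumptions, not a real workload:

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class Hello(BaseHTTPRequestHandler):
    """Stand-in for the app under test; replies 'hello' to every GET."""
    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

def run_load_test(requests_per_worker=20, workers=4):
    """Return (completed requests, elapsed seconds) for a local server."""
    server = ThreadingHTTPServer(("127.0.0.1", 0), Hello)  # port 0: pick any free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = "http://127.0.0.1:%d/" % server.server_port
    ok = []

    def worker():
        for _ in range(requests_per_worker):
            with urllib.request.urlopen(url) as resp:
                if resp.status == 200:
                    ok.append(1)

    start = time.time()
    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    elapsed = time.time() - start
    server.shutdown()
    return len(ok), elapsed

done, secs = run_load_test()
print("%d requests in %.2fs (%.0f req/s)" % (done, secs, done / secs))
```

Point the same worker loop at a staging URL of your real app and you get actual numbers to size against, which is exactly the data the comments above say you can't estimate in advance.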