views: 63
answers: 4

Hi All,

I am running a heavy-traffic site and our server is reaching its limits. At the moment the entire LAMP stack is on one box (not ideal).

I would like to move the database onto its own box or onto a cloud service, but in my previous experience, moving the database off the same box as the web server increases read latency quite dramatically and slows down the site.

Is using a cloud service going to overcome this problem? As far as I can tell it's essentially the same situation as moving the database onto a separate box under my control, in which case why is there so much popularity around cloud-based database services at the moment?

Are cloud-based database services so quick that read latency is low enough that it's almost like having the database on the same box in the same datacentre?

A: 

There will be increased latency going across the network, but it shouldn't be that noticeable; gigabit Ethernet is pretty fast. When you tried splitting the boxes, how did you connect to the other box? You should be using a local, internal IP address (i.e. 192.168.#.#). If you are not, your requests may get routed over the internet even though the boxes are physically next to each other.

Moving to a cloud won't solve your problems if the servers aren't networked properly.
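For a quick sanity check, a rough sketch along these lines (the host, port, and sample count are placeholders, not your actual setup) will show what address the database host resolves to and whether the TCP round-trip is in the sub-millisecond range you would expect on a local gigabit LAN:

```python
import socket
import time

DB_HOST = "192.168.1.20"   # placeholder: the database box's internal IP or hostname
DB_PORT = 3306             # MySQL's default port
SAMPLES = 10

# Confirm what address the host actually resolves to (it should be a private one).
print(DB_HOST, "resolves to", socket.gethostbyname(DB_HOST))

# Time a handful of TCP connects as a rough proxy for network round-trip latency.
timings = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    with socket.create_connection((DB_HOST, DB_PORT), timeout=2):
        pass
    timings.append((time.perf_counter() - start) * 1000)

print("connect latency: min %.2f ms, avg %.2f ms"
      % (min(timings), sum(timings) / len(timings)))
```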

Brent Baisley
I can't remember the exact details of the network configuration I'm afraid, it was a few years back. What I'm trying to establish is: is the extra latency of connecting to a cloud service small enough that it makes sense to sacrifice a slight performance decrease for the stability and raw processing benefits? That's probably a better-worded question than the one above. Thanks for your help so far.
Gcoop
192.168.x.x is not a problem. If your traffic is being routed out you have a misconfigured switch or a corrupted route table.
Steve
A: 

As a rule of thumb, latency increases the further apart the two servers are: each router between 'here' and 'there' may add an extra few milliseconds, and even though packets travel at close to the speed of light, a thousand kilometres adds roughly 3 milliseconds each way in a vacuum, and closer to 5 milliseconds in fibre. In short, the closer the servers are together, the better off you are.
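To put rough numbers on that, here is a back-of-the-envelope sketch, where the per-hop cost and the distances are illustrative assumptions rather than measurements:

```python
# Back-of-the-envelope one-way latency: light in fibre covers roughly 200 km per ms.
FIBRE_KM_PER_MS = 200.0   # about two-thirds of the speed of light in a vacuum
PER_HOP_MS = 0.5          # assumed processing/queueing cost per router hop

def one_way_latency_ms(distance_km, hops):
    return distance_km / FIBRE_KM_PER_MS + hops * PER_HOP_MS

print(one_way_latency_ms(0.05, 1))     # same datacentre: well under a millisecond
print(one_way_latency_ms(1000, 10))    # ~1000 km away: around 10 ms one way
```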

If you feel the load is a bit high at the moment, have you considered using memcached or something similar to avoid repeating the same queries?
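A minimal cache-aside sketch, assuming the pymemcache client and a memcached instance running alongside the web server; the key name, TTL, and query are only illustrative:

```python
import json
from pymemcache.client.base import Client

# Assumes memcached is running on the web box on its default port (11211).
cache = Client(("127.0.0.1", 11211))

def get_popular_articles(db_conn, ttl_seconds=60):
    """Return cached query results, hitting the database only on a cache miss."""
    key = "popular_articles"              # illustrative cache key
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)

    # Cache miss: run the query once, then keep the result for ttl_seconds.
    cur = db_conn.cursor()
    cur.execute("SELECT id, title FROM articles ORDER BY views DESC LIMIT 10")
    rows = cur.fetchall()
    cur.close()

    cache.set(key, json.dumps(rows), expire=ttl_seconds)
    return rows
```

Even a TTL of a few seconds can absorb most of the repeated reads on a hot page.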

e4c5
A: 

Using a cloud service just for your database won't help your situation. If you only move the database, you're physically placing it in a remote location, which will always increase latency, no matter how powerful the hardware serving the content.

I would suggest that you will see a benefit from hosting your database on a separate machine from your web server, as long as the two are physically next to each other on a dedicated network (as already suggested).

If you want to explore the benefits of cloud services, I would suggest doing so only if you can move both the database and the web server together. Furthermore, it's really only of benefit if you explore load balancing across multiple web servers and/or replicated databases (the ability to scale dynamically is a major benefit of cloud-based platforms).
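To illustrate what replicated databases typically mean on the application side, a read/write split might look roughly like the sketch below; the PyMySQL driver, host names, and credentials are assumptions for the example, not part of the setup discussed here:

```python
import pymysql

# Writes go to the primary; reads can be spread across one or more replicas.
# Host names and credentials are placeholders.
primary = pymysql.connect(host="db-primary.internal", user="app",
                          password="secret", database="site")
replica = pymysql.connect(host="db-replica-1.internal", user="app",
                          password="secret", database="site")

def run_query(sql, params=None, write=False):
    """Route writes to the primary and reads to a replica."""
    conn = primary if write else replica
    with conn.cursor() as cur:
        cur.execute(sql, params or ())
        if write:
            conn.commit()
            return cur.rowcount
        return cur.fetchall()

# Usage: reads hit the replica, writes hit the primary.
articles = run_query("SELECT id, title FROM articles LIMIT 10")
run_query("UPDATE articles SET views = views + 1 WHERE id = %s", (1,), write=True)
```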

Tim Kane
A: 

Clouds are about paying someone else to manage the infrastructure so you don't have to. They also make it possible to acquire capacity rapidly: since you don't have to wait for physical machines to be delivered, you can simply tap into the cloud's unused capacity. Providers also build features on top of this infrastructure to make it easier to scale (usually by programming against a particular model).

If you are thinking about a cloud, when are you planning on moving to 10 servers, or 100? Do you deal with traffic that comes in large bursts, where the peaks are very high?

Since you are talking about moving to a second box, I don't think you need to have the cloud discussion yet. Just add a database server and use caching, as e4c5 recommended.

Steve