views:

71

answers:

1

Could someone point out the best way to control website downtime, especially when running many sites?

A: 

Check out Todd Hoff's highscalability.com for many resources and case studies on how web sites are architected for high availability and scalability. I've also found Theo Schlossnagle's Scalable Internet Architectures and Cal Henderson's Building Scalable Web Sites to be very helpful.

Some themes include:

  • architecting your application to be sessionless (i.e., any request can be handled by any one of your web servers),
  • doing careful front-end design of your web pages,
  • database replication and failover,
  • load balancing across your web servers,
  • leveraging content distribution networks like Akamai,
  • putting big HTTP caches in front of your web servers,
  • using in-memory caches like memcached,
  • n+m redundancy,
  • serving static content from separate, specialized servers (like nginx or lighttpd), and
  • serving static content out of Amazon's S3 (Simple Storage Service).
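For the theme of serving static content from a specialized server while load balancing dynamic requests, a minimal nginx sketch might look like the following. The paths, server addresses, and upstream name are illustrative assumptions, not from the answer.

```nginx
# Serve static assets directly from disk; proxy everything else
# to a pool of application servers.
upstream app_backend {
    server 10.0.0.1:8000;      # hypothetical app servers,
    server 10.0.0.2:8000;      # load balanced round-robin by default
}

server {
    listen 80;

    # Static files served by nginx itself (hypothetical path).
    location /static/ {
        root /var/www/example;
        expires 7d;            # let browsers and HTTP caches keep assets
    }

    # Dynamic requests go to the application tier.
    location / {
        proxy_pass http://app_backend;
    }
}
```

This keeps the application servers free to handle only dynamic work, which is the same division of labor the S3 and CDN suggestions take even further.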
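To illustrate the memcached theme above, here's a minimal cache-aside sketch in Python. A plain dict stands in for a real memcached client (with a real server you'd use a client library such as pymemcache), and `fetch_from_db` is a hypothetical placeholder for a slow database query; neither is from the original answer.

```python
# Cache-aside pattern: check the cache first, fall back to the database
# on a miss, and populate the cache so the next request is fast.

cache = {}  # stands in for a memcached client (assumption, for illustration)

def fetch_from_db(key):
    # Hypothetical placeholder for an expensive database query.
    return f"row-for-{key}"

def get(key):
    value = cache.get(key)
    if value is None:           # cache miss: hit the database
        value = fetch_from_db(key)
        cache[key] = value      # warm the cache for subsequent requests
    return value

first = get("user:42")   # miss -> goes to the database
second = get("user:42")  # hit  -> served from the in-memory cache
```

In production you'd also set an expiry time on each cache entry and invalidate entries when the underlying row changes.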

This stuff is really fascinating ...

Jim Ferrans