It looks like we are going to have to start load balancing our web servers soon.
We have a feature request to edit robots.txt dynamically, which is not a problem for a single host. However, once the load balancer is up and running, it sounds like I will have to scp the file over to the other host(s) after every edit.
This sounds like a really bad approach. How would you handle this situation?
I already let the client edit the robots meta tag, which (IMO) should effectively accomplish the same thing he wants from editing robots.txt, but I really don't know that much about SEO.
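For reference, here is roughly what the two mechanisms look like (generic examples, not our actual rules):

# robots.txt -- one site-wide file served from the web root
User-agent: *
Disallow: /private/

<!-- robots meta tag -- set per page, inside each page's <head> -->
<meta name="robots" content="noindex, nofollow">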
Maybe there is a completely different way of handling this?
UPDATE
Looks like we will store it in S3 for now and cache it in memcached on the front end...
HOW WE ARE DOING IT NOW
We are using Merb. I mapped a route to our robots.txt like so:
match('/robots.txt').to(:controller => 'welcome', :action => 'robots')
The relevant controller action looks like this:
def robots
  @cache = MMCACHE.clone
  begin
    # Serve the cached copy when memcached has one
    robot = @cache.get("/robots/robots.txt")
  rescue
    # Cache miss (this client raises on a missing key): fall back
    # to S3 and repopulate the cache; a TTL of 0 means no expiry
    robot = S3.get('robots', "robots.txt")
    @cache.set("/robots/robots.txt", robot, 0)
  ensure
    # Close the connection even if the S3 fetch fails
    @cache.quit
  end
  robot
end
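That only covers the read path. The write path for the edit feature would look something like this (update_robots, params[:robots_content], S3.put, and @cache.delete are my guesses mirroring the wrappers above, not confirmed APIs):

def update_robots
  robot = params[:robots_content]   # hypothetical form field
  # Write the edited file to S3 so every host behind the load
  # balancer serves the same copy (S3.put assumed to mirror S3.get)
  S3.put('robots', "robots.txt", robot)
  @cache = MMCACHE.clone
  begin
    # Drop the cached copy; the next hit on /robots.txt repopulates
    # it from S3 (assumes the memcached client exposes delete)
    @cache.delete("/robots/robots.txt")
  ensure
    @cache.quit
  end
  robot
end

With that in place there is nothing to scp: an edit on any one host lands in S3, and the other hosts pick it up as soon as their cached copy is invalidated.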