What is the preferred method of keeping a server farm synchronized? It's currently a pain to have to upload to multiple servers. Looking for a balance of ease of use and cost. I read somewhere that a DFS can do it, but that's something that requires the servers to run on a domain. Are there any performance issues with using a DFS?
A:
We use SVN to keep the server files in specific repositories and then have a script that pulls the latest files out of SVN onto each of the servers in the web farm (6 servers). The script uses the TortoiseSVN utility, as it has an easier command-line interface for the admins, and it updates all the machines from a single server, usually the one with the lowest IP address in the pool.
We ensure no server has any local modifications in its checked-out working copy, which avoids conflicts, and we get file histories and a change log in SVN along with the ability to roll back. We also keep our admin scripts in the repository so they get the same versioning and change logs.
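For illustration only, here is a minimal sketch of what such an update script might look like. The server names, working-copy path, and use of the plain `svn` client over SSH are all assumptions on my part; the setup described above drives TortoiseSVN from a single controlling server instead.

    #!/usr/bin/env python3
    """Sketch: pull the latest files from SVN onto every server in the farm.

    Assumptions (not from the setup described above): the host names, the
    working-copy path, and that each box is reachable over SSH with the
    command-line `svn` client installed.
    """
    import subprocess

    # Hypothetical web farm members and the checked-out repository path on each.
    SERVERS = ["web01", "web02", "web03", "web04", "web05", "web06"]
    WORKING_COPY = "/var/www/site"

    def update_server(host: str) -> None:
        """Run `svn update` on one server and report the result."""
        result = subprocess.run(
            ["ssh", host, f"svn update {WORKING_COPY}"],
            capture_output=True,
            text=True,
        )
        status = "ok" if result.returncode == 0 else "FAILED"
        print(f"{host}: {status}\n{result.stdout or result.stderr}")

    if __name__ == "__main__":
        for server in SERVERS:
            update_server(server)

Because every server updates to the same repository revision, the farm stays consistent, and a bad deploy can be rolled back by updating to an earlier revision (`svn update -r <rev>`).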
Dave Anderson
2009-08-03 17:50:26
That's not quite what we're looking for. Some of the things that need to sync aren't really candidates for SVN. We're looking for something that can handle content files, such as user-uploaded images, too.
Darthg8r
2009-08-03 18:03:37
I see. Yes, we use a SAN store (NetApp) for all the CMS-generated material and have all the servers using the same mapping to that area. We also have a dedicated domain just for referencing content items, but that is more of a legacy system where the content teams have FTP access and reference resources by URL.
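As a rough sketch of that approach (the share name and upload handler below are hypothetical, not from the setup described above), user-uploaded content is written once to the shared mount instead of being copied to each server:

    import shutil
    from pathlib import Path

    # Hypothetical mapping: every web server mounts the same SAN/NAS share at
    # this path, so a file written here is immediately visible to the whole farm.
    SHARED_CONTENT_ROOT = Path(r"\\netapp01\webcontent\uploads")

    def save_upload(temp_file: str, filename: str) -> Path:
        """Copy an uploaded file from local temp storage to the shared content area."""
        destination = SHARED_CONTENT_ROOT / filename
        destination.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(temp_file, destination)
        return destination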
Dave Anderson
2009-08-03 18:51:58