We need to regularly synchronize many dozens of binary files (project executables and DLLs) between many developers at several different locations, so that every developer has an up-to-date environment to build and test in. Due to the nature of the project, updates must be frequent and on-demand (overnight updates are not sufficient). This is not pretty, but we are stuck with it for a while.

We settled on using a regular version (source) control system: put everything into it as binary files, get the latest before testing, and check in the updated DLLs after testing.

It works fine, but a version control client has a lot of features that make no sense for us, and people occasionally get confused by them.

Are there any tools better suited for the task? Or maybe a completely different approach?

Update: I need to clarify that it's not a tightly integrated project - it's more like an extensible system with a heap of "plugins", including third-party ones. We need to make sure these plugins work nicely with recent versions of each other and of the core. A centralised build, as suggested below, was considered initially, but it's not an option.

+1  A: 

You should look into continuous integration and having some kind of centralised build process. I can only imagine the kind of hell you're going through with your current approach.

Obviously that doesn't help with keeping your local files in sync, but I think you have bigger problems with your process.

Shaun Austin
The process is awful, agreed, but there are important legacy and practical reasons for it. Total centralization is not possible - basically, you don't want to build your core together with some flimsy interface plugin.
ima
+4  A: 

I'd probably take a look at rsync.

Just create a .CMD file that contains the call to rsync with all the correct parameters and let people run that. rsync is very smart about deciding which parts of a file need to be transferred, so it'll be very fast even when large files are involved.
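
A minimal sketch of such a wrapper, assuming a hypothetical rsync daemon on a machine called buildserver exporting a module named latest (the server, module, and paths are all placeholders, and the destination path style depends on which rsync port you use - cwRsync's Cygwin-style paths are shown here):

    @echo off
    rem sync-latest.cmd - pull the current binaries from the central server
    rem -a preserve attributes, -v verbose, -z compress,
    rem --delete remove local files that were dropped upstream
    rsync -avz --delete rsync://buildserver/latest/ /cygdrive/c/dev/modules/

Developers would just run sync-latest.cmd whenever they need fresh binaries.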

What rsync doesn't do, though, is conflict resolution (or even detection), but in the scenario you described it's mostly reading from a central place, which is exactly what rsync is designed to handle.

pilif
Thanks, it's a possible solution, but I still prefer what we have now. I'm actually impressed by the performance of our version control system (SourceGear) with a huge binary project.
ima
A: 

Building the project should be a centralized process in order to allow for better control; otherwise your solution will descend into chaos in the long run. Anyway, here is what I'd do:

  • Create the usual repositories for source files, resources, documentation, etc. for each project.
  • Create a repository for resources. It will hold the latest binary version of each project as well as any required resources, files, etc. Keep a clean folder structure for each project so developers can "reference" the files directly.
  • Create a repository for final builds, which will hold the actual stable releases. It will receive the stable files, generated automatically (if possible) from the checked-in sources. This holds the real product - the real version for integration testing and so on.

While far from perfect, this lets you define well-established protocols: check in your latest DLL here, generate the "real" version from the latest source there.
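
For illustration, the layout could look something like this (all the names here are made up, not from the answer):

    /src/<project>/          usual source repository per project
    /resources/<project>/    latest binaries plus required resources, referenced directly by developers
    /releases/<version>/     stable builds, generated automatically from the checked-in sources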

Jorge Córdoba
I edited the question to explain why it's not centralized - otherwise what you described is exactly what we do now. We are not happy with how source control tools handle binary files and deployment scenarios, though.
ima
A: 

What about embedding a 'what' string in the executables and libraries? Then you can synchronise the desired list of versions with a manifest.

We tend to use CVS id strings as a part of the what string.

/* the "@(#)" prefix is the marker the what(1) tool searches for inside a binary */
const char cvsid[] = "@(#)INETOPS_filter_ip_$Revision: 1.9 $";

Entering the command

what filter_ip | grep INETOPS

returns

INETOPS_filter_ip_$Revision: 1.9 $

We do this for all deliverables, so we can check whether the versions in a bundle of libraries and executables match the list in an associated manifest.
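
A minimal sketch of that check, assuming a hypothetical manifest holding one "name revision" pair per line (the manifest format and file names are my assumption, not from the answer):

    # manifest.txt lines look like: filter_ip 1.9
    while read name rev; do
        what "$name" | grep -q "Revision: $rev" || echo "version mismatch: $name"
    done < manifest.txt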

HTH.

cheers,

Rob

Rob Wells
A: 

Subversion handles binary files really well, is pretty fast, and is scriptable. VisualSVN and TortoiseSVN make dealing with Subversion very easy too.

You could set up a folder that's checked out from Subversion with all your binary files (which all developers can commit to and update from), then just type "svn update" at the command line, or use TortoiseSVN: right-click on the folder, click "SVN Update", and it'll update all the files and tell you what's changed.
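
The command-line flow would look something like this (the repository URL, local path, and DLL name are placeholders):

    svn checkout http://server/svn/binaries C:\dev\binaries   (once per machine)
    svn update                                                 (before testing, inside the working copy)
    svn commit -m "updated MyPlugin.dll after testing"         (after testing)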

Garo Yeriazarian
That's what we do now (with a different version control system, but one that is better than SVN in that respect), as described in the question.
ima
+3  A: 

Another option is unison.
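
For reference, a minimal invocation syncing a local folder against a server over ssh (the host and paths are placeholders; -batch and -auto make it run without prompting for non-conflicting changes, and unlike rsync, unison does detect conflicting updates):

    unison C:/dev/modules ssh://buildserver//srv/modules -batch -auto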

Looks promising, we'll give it a try.
ima