views: 868
answers: 4

So I've got a situation where we have a project with 10 developers. Each developer, when they come in for the day, is randomly issued a machine to use for development that day. The machine names are different, say DEV01 - DEV10. At the time that they are issued to the developers, the machines are identical, and no changes the developers make during the day are persisted on the machines (source code changes are stored in TFS, not locally). These are of course actually virtual machines, but that's not really relevant to the point at hand.

The problem is that each morning, the developers run into 3 issues:

1) The machine that they are assigned may not be the same machine they were last assigned to. For example, DevMan A might have used DEV04 yesterday, and received DEV06 today. His workspace definitions are still tied to DEV04; he must create a new workspace, or migrate the old workspace to DEV06.

2) The machine that they are assigned may have been in use yesterday, and some of the mappings may conflict. For example, DevMan A might have DEV04 today, and wish to create a workspace mapping the project folder to "C:\MyProj\Solution". However, DevMan B had DEV04 yesterday, and he used the same project folder. TFS now complains.

3) This may be the first time they are on a given machine. They now need to recreate all of their source-control mappings for the new machine.

All of these issues can be resolved in a straightforward fashion on a case-by-case basis, but it does sap some productivity from the morning. We'd much prefer it if the TFS workspace definitions could be 'relaxed', such that they somehow did not include the machine name. Barring that, if anyone is aware of a solution to the above problems that can run automatically, or with limited user intervention, that would also be ideal.
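For issue #1 specifically, tf.exe has a `tf workspaces /updateComputerName:<oldname>` option that reassigns the server-side workspace records from an old machine name to the machine the command runs on. A minimal morning-script sketch (the state-file path and server URL are assumptions, and the tf call itself is left as a comment since it requires Team Explorer on the client):

```shell
# Sketch of a hypothetical logon script. It remembers which machine the
# developer used last; if today's machine differs, the (commented) tf.exe
# call would migrate yesterday's workspaces to today's machine.
STATE_FILE="${STATE_FILE:-$HOME/.last_tfs_machine}"
TODAY="$(hostname)"
LAST="$(cat "$STATE_FILE" 2>/dev/null || echo "$TODAY")"
if [ "$LAST" != "$TODAY" ]; then
    # Requires Team Explorer's tf.exe; the server URL is an assumption:
    #   tf workspaces /updateComputerName:"$LAST" /s:http://tfsserver:8080
    MIGRATION_NEEDED=yes
else
    MIGRATION_NEEDED=no
fi
printf '%s\n' "$TODAY" > "$STATE_FILE"
```

Run once at logon; on a day when the developer draws the same machine as last time, it does nothing.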

A: 

You can place the workspace on a shared network drive, so it doesn't matter what machine (virtual or otherwise) the developer is logged into. This works fine, but you will also have to configure some COM stuff once to keep it happy.

Steven Sudit
That solves problem #2, but not problem #1.
GWLlosa
If you bind the root to (say) P:\dev\users\my.name\tfs once on each machine, you're done.
Steven Sudit
Files can be on the network, but the workspace needs to be created per PC you are working on.
Perica Zivkovic
@Perica: That's correct. You would have to bind it once on each machine.
Steven Sudit
+2  A: 

1.) For every workstation you are going to work on, you need to define a workspace (remote <=> local mapping). You can store source files on the network (I don't recommend this due to VS caching), but the local workspace (the mapping definition) needs to exist on a specific PC for a specific user.

2.) Create separate local folders per developer, to prevent the mess of different people working in the same folder and getting latest on top of others' checked-out code. For example:

c:\Projects\DevManA\...
c:\Projects\DevManB\... etc.

3.) This will probably cause lots of discussion, but I recommend mapping the root of your TFS to your local work area, e.g. DevManA's mapping:

$/ to C:\Projects\DevManA\

Then when you are checking projects out you will get structure:

c:\Projects\DevManA\TFSProject1\..
c:\Projects\DevManA\TFSProject2\..
c:\Projects\DevManA\TFSProject3\..

etc.

This is an easy, quick mapping that everybody can do in five seconds, and you are ready to go. Everybody then has the same layout on disk as in TFS, which also "fits the brain" more easily.
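The one-time root mapping described above can be scripted with tf.exe. This sketch only assembles and prints the two commands (the workspace name and server URL are assumptions), since tf.exe exists only on a machine with Team Explorer installed:

```shell
# Build the (hypothetical) one-time setup commands for developer DevManA.
DEV="DevManA"
WORKSPACE="${DEV}_WS"
NEW_CMD="tf workspace /new $WORKSPACE /s:http://tfsserver:8080"
MAP_CMD="tf workfold /map \$/ C:\\Projects\\$DEV /workspace:$WORKSPACE"
# On a real client you would run both commands; here we just show them:
echo "$NEW_CMD"
echo "$MAP_CMD"
```

After the `tf workfold /map` step, a `tf get` under C:\Projects\DevManA produces the per-project layout shown above.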

Perica Zivkovic
+1 - nice idea - if they are insistent on using lots of extra machines for their devs, may as well put the extra disk space to use. However, I think mapping $/ is too aggressive for all but the smallest installations. It's very easy to accidentally kick off a "tf get" or similar that's much much costlier (in server load, if nothing else) than you intended.
Richard Berg
@Richard that's why I said it can cause discussions :D True in a way, but in practice not all developers are working on all TFS projects (that's one filter). Even if you are working on a big project, good practice is (again at risk of raising discussions :D) to have a developer branch + release branches (another filter), and within your developer branch, separate solutions to keep the code nicely organized (yet another filter). After all those filters, a developer works in one or two solutions per day and does not need to "get" more than that.
Perica Zivkovic
True, *developers* usually don't need to work on large subtrees, but TFS doesn't know that. By design, all TFS commands operate against the entire workspace unless you specify a path filter. If you **ever** forget to do so, the side effects can range from confusion to server overloading to actual correctness bugs. For example: while checking in a small bugfix to Branch 1, you accidentally commit a large incomplete changeset that was pending over in Branch 2 (but shared the same workspace). I've done it :(
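To illustrate the point in the comment above (paths are hypothetical; the commands are shown as strings rather than executed): an unscoped `tf get` in a $/-rooted workspace walks every mapped folder, while adding an itemspec limits it to one subtree.

```shell
# Contrast an unscoped get (entire workspace) with a scoped one.
UNSCOPED="tf get /recursive"                        # walks every mapped folder under $/
SCOPED="tf get \$/TFSProject1/DevBranch /recursive" # touches only one branch
echo "$UNSCOPED"
echo "$SCOPED"
```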
Richard Berg
+3  A: 

First, the super-obvious answer is to dedicate machines to users.

Secondly... if you really want to solve the problem as stated:

You cannot use workspaces without assigning them to a specific machine. This assumption is implicit in the product. But you can fool it :)
Warning: This recipe seems to work but I have not personally run a project using it.

  1. Assign a "virtual" machine name to each user, e.g. UseridVM
  2. For each virtual machine, the following (persistent or startup-scripted) setup is needed:
    • create a new environment variable (a "User Variable"), e.g. _CLUSTER_NETWORK_NAME_=UseridVM
  3. Nice to have: Use a virtual hard drive which is dedicated to the user id and mapped (or mounted using a script) to "D:" which follows the user from VM to VM.

Now, when the user opens up Visual Studio, the workspace will use the specified value "UseridVM" as the machine name, thus the same workspace will be found on each machine.

If you don't have a persistent virtual hard drive, then each user must be sure to do a 'real' "Get Latest" (Get Specific Version, check all boxes) when starting for the day because the workspace memorizes what files have already been downloaded and will not re-download them if it feels they already exist.
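The setup in step 2 can be scripted at logon. A sketch, assuming the UseridVM naming scheme from step 1 (`USER_ID` is a stand-in for the real user id; the `setx` line is Windows-only, so it is left as a comment here):

```shell
# Derive the stable per-user "machine" name and (on Windows) persist it.
USER_ID="${USER_ID:-DevManA}"   # hypothetical user id
VM_NAME="${USER_ID}VM"          # e.g. DevManAVM, following the answer's scheme
# On the real VM, persist it as a user-level environment variable:
#   setx _CLUSTER_NETWORK_NAME_ "$VM_NAME"
echo "$VM_NAME"
```

Because the variable is set per user, the same workspace name follows the developer regardless of which DEV01-DEV10 image they are issued.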

Jennifer Zouak
So _CLUSTER_NETWORK_NAME_ is an environment variable recognized by TFS as a replacement to the machine name?
GWLlosa
Yes. More specifically, it is recognized by the Windows OS as a local override to HOSTNAME, and TFS presumably uses HOSTNAME.
Jennifer Zouak
This seems to be working pretty well in conjunction with a script on startup to set the env variable for each user.
GWLlosa
+1  A: 

I don't know if this will meet the requirements stated in the question, but you might check this out:

http://blogs.msdn.com/granth/archive/2009/11/08/tfs2010-public-workspaces.aspx

TFS 2010 has added public / shared workspaces which can be used by multiple users on the same machine.

soccerdad