There is an underlying process question here that I think shouldn't be overlooked:
When is the proper time to upgrade development tools and production environments?
On the one hand, you could skip 2008, though that raises the question of when 2010 would be adopted: upon first release, at the first service pack, or at some other milestone? Staying locked in on 2005 and the 2.0 framework while others move on to newer frameworks may also mean creating more legacy code. Even if you do switch to 2008, it can still target the 2.0 framework, so the upgrade of the .NET framework itself can happen separately, which some may prefer. Another key question in this camp is who does the research to evaluate the differences between versions and decide which is worth the shift.
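To make that multi-targeting point concrete: in VS 2008 the target framework is just a setting in the MSBuild project file (or under Project Properties), so a team can adopt the new IDE while still compiling against 2.0. The fragment below is only a minimal sketch of what such a project file might look like; the project and assembly names are made up for illustration.

    <!-- Hypothetical "LegacyApp.csproj" opened in Visual Studio 2008 -->
    <Project ToolsVersion="3.5" DefaultTargets="Build"
             xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
        <OutputType>Library</OutputType>
        <AssemblyName>LegacyApp</AssemblyName>
        <!-- The key line: the 2008 tools keep building against the 2.0 framework -->
        <TargetFrameworkVersion>v2.0</TargetFrameworkVersion>
      </PropertyGroup>
      <ItemGroup>
        <Compile Include="**\*.cs" />
      </ItemGroup>
      <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
    </Project>

Flipping that one value to v3.5 later is what lets the framework upgrade be a separate, smaller decision from the IDE upgrade.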
On the other hand, you could adopt a continuous strategy of preparing to upgrade every three years or so, since the Visual Studio releases of the past decade came roughly in 2002, 2003, 2005, and 2008. This seems the better approach to me, as it keeps things in constant evolution rather than staying locked in at all. New features are also more likely to get used, because the new tools arrive regularly; no single upgrade feels like a large step, since you are always expecting to move again within two or three years.
Of course, as I say this, my own work machine has Visual Studio 2003, 2005, and 2008 installed, so I am in that latter camp, which makes sense to me. I remember that ten years ago my work machine had NT 4.0, a Pentium II 333 MHz processor, 64 MB of RAM, and a 4 GB hard drive that had to be split into two partitions because a single partition couldn't be that big. Now my work machine has 4 GB of RAM alone, a 2.66 GHz dual-core processor, and a 160 GB hard drive. Could I, in another ten years, have a machine with hundreds of GBs of RAM? While that may seem ridiculous, if I were sharing a machine with a handful of other developers, it may make sense to divide up a huge amount of memory amongst us all.