For server-based software that needs to be stable, hitting "every release" isn't necessarily a good thing. The only benefits you get from a new version are the new features (which, if you don't need them, are not a concern) and the chance to find incompatibilities now rather than later (on top of whatever new incompatibilities the next release introduces).
For this reason, we still support SQL 2000 on our primary product. We have ported it to 2005 and 2008 and tested it against both... but we are not using their new features. Too many clients are still running 2000. We are finally looking to drop support for 2000 when 2010 comes out, since ten years seems a reasonable period; our newest version (not generally released, but in use with some clients) already uses some 2005 features.
As far as our development environment goes, we did move to 2005 and 2008 about a year after each release (once the first service packs were out). We can be more aggressive there because our clients aren't on that treadmill. The features in 2005 and 2008 were also compelling (I don't use Linq to SQL, but I love Linq to Objects). We also do a lot of prototyping on newer versions of software, and keep our internal projects on newer software, to keep up with the technology's features for planning and learning.
As far as becoming an expert goes, I think that given the scope of the technologies in question, nobody is an expert in the entire product. If you know all about the query optimization engine and how to wring the last bit of performance out of it, you are less likely to have spent a lot of time on the replication engine. Personally, I think you should sample everything, but at the end of the day you have to get to work, and your work will rarely require you to be an expert at everything. Just knowing that the features are there is enough, so that the day you need them... you can quickly acquire the skill and move on.