I believe the best way to gain real experience with a technology is to use it on actual work, so how do you decide between "I'm implementing our next project in X because we need to" and "I'm implementing our next project in X because I want to / think it's awesome"?
I get that it's about using the 'best tool for the job', but one look at the fanatical debates over what the 'best tool' actually is shows how subjective that can be. For instance, I could develop my next work project in [example: .NET 3.5] with no learning curve, or I could implement it in [example: .NET 4], which WILL have a learning curve, plus potential deployment issues (new deployment dependencies, for example) that would have to be worked out as well.^1
I can see the argument for learning in your spare/free time, and I do that because it's fun, but it's not nearly the same as being forced (or forcing yourself) to learn a technology for a paid project with a deadline.
1> I work in the MS stack, so the examples could just as easily have been WinForms/WPF, WebServices/WCF, EntLib/NHibernate, or even paradigms like Waterfall/Scrum. I hope you get the picture.