There was a time when we only had to choose between two or three technologies or strategies to develop a module.
Now, for every component, module, or project, small or large, we have an almost uncountable number of options. Navigating them might be easy for those with years of experience, but not for those who are new to programming, say with less than a year behind them.
I sometimes get frustrated with the number of data access choices in the .NET world. We cannot go and read about every tool on the market and what it has to offer, for each and every product.
The reason for asking this question is that we recently had to work on a project, and the specs for the data access layer were finalized around ADO.NET. About 20% of the way into the project, a new developer joined our department (but not our team). He is smart and helpful, and we enjoy working with him.
During a code review, he personally advised us that it would be better to use LINQ to SQL for the module we were working on. He was convincing, and after a constructive debate we agreed to switch to LINQ to SQL.
However, management was not happy about that. Their argument was that we should have come up with this "fantastic idea" before starting the module, and that the resources already spent on the first 20% of the work would now be wasted.
Given how frequently new products, technologies, and strategies appear, we find it difficult to stay informed about all of them.
We've had success using ADO.NET. We had a general idea about LINQ, NHibernate, and many others, but we went ahead with ADO.NET. I am not opposed to learning new things; that is exactly why we collectively pushed for using LINQ.
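To make the trade-off concrete, here is a rough sketch of the kind of difference we were weighing. The Orders table, the Order class, the connection string, and the method names below are made up purely for illustration; they are not from our actual project.

```csharp
using System.Data.SqlClient;       // ADO.NET
using System.Data.Linq;            // LINQ to SQL
using System.Data.Linq.Mapping;
using System.Linq;

// Hypothetical mapping class for an Orders table, for illustration only.
[Table(Name = "Orders")]
public class Order
{
    [Column(IsPrimaryKey = true)] public int Id;
    [Column] public int CustomerId;
    [Column] public decimal Total;
}

public static class DataAccessSketch
{
    // Classic ADO.NET: the SQL lives in a string, and mapping
    // result columns onto objects is done by hand.
    public static void LoadWithAdoNet(string connectionString, int customerId)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT Id, Total FROM Orders WHERE CustomerId = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", customerId);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);
                    decimal total = reader.GetDecimal(1);
                    // ...copy the values into domain objects manually
                }
            }
        }
    }

    // LINQ to SQL: the same query written against mapped classes;
    // the compiler checks it and the provider translates it to SQL.
    public static void LoadWithLinqToSql(string connectionString, int customerId)
    {
        using (var db = new DataContext(connectionString))
        {
            var totals = db.GetTable<Order>()
                           .Where(o => o.CustomerId == customerId)
                           .Select(o => new { o.Id, o.Total })
                           .ToList();
        }
    }
}
```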
Question: Are we at fault for making this choice at the time we did?
Are there any metrics or guidelines for deciding which technology to choose in a given situation, and when not to switch mid-stream?