.NET 3.5, .NET 4.0, WPF, Silverlight, ASP.NET MVC - there's really a lot of new Microsoft technology released or on the horizon to try out these days. (The examples I gave are all Microsoft technologies, but this can apply to any language or platform.) I am curious how this is handled in the company you work for. A few examples:

  • Do you have a CTO that determines what technology the company uses?
  • Are development teams free to choose what technology they use? For example: framework version, classic ASP.NET vs ASP.NET MVC, ADO.NET Entity Framework vs Linq2Sql or NHibernate? Or a mix of these?
  • What new technologies does the company you work for try out and why?
  • Does your company have dedicated resources (time) to try out WPF or whatever technology, just for research, or do you try things out in your spare time and try to introduce them to your company?

These are just examples to make my question clearer. To summarize, I'd like to know what this process looks like, who is responsible, and who makes the decisions. Does your company jump on the bandwagon, or is it reluctant to try new technologies? And are you comfortable with this situation?

At the company I work for, we still use .NET 2.0 (although we are now slowly switching to .NET 3.5), haven't seriously looked into ASP.NET MVC, haven't tried out WPF at all, etcetera. And some find it pretty hard to convince people to do so. Is it fair to expect otherwise?

+3  A: 

Since I work in such a small company and I am typically either the only developer or the lead developer in a very small group, I can usually convince my boss to use whatever I think would be best for a given project/situation.

TheTXI
+5  A: 

At my company, we have an architecture group that determines which technologies are used. People are welcome to read up on alternative technologies and make suggestions, but at the end of the day, it's the architecture group that makes the decisions.

While this may seem restrictive, it does ensure that all of the development groups are using the same or similar technologies, and moving from one group to the next is fairly easy. As well, by having one group do all the research, you ensure that you don't waste time by having multiple groups duplicate the research effort.

Elie
Though it is hard to accept a single answer, I feel this is the most complete answer of them all. Thanks all.
Razzie
+1  A: 

Where I work there is an architect team which looks at technologies from a high level and makes recommendations to the various actual teams. A subset of the architect team takes the technologies, experiments with them, and out of that produces:

  • Internal 1 hour overview sessions
  • Week long boot camps
  • Whitepapers/Posters

The more important the technology is, the more of that list is produced. All of that feeds into the teams, which, combined with customer requirements for technology, actually make the decision about what each team should use.

Robert MacLean
A: 

I think any company that tries new technology for the sake of it, because it's bleeding edge and 'innovative', is crazy. To have a formal 'let's play with new technology to try it out' department is just nuts... unless they're in the business of providing technology consulting to other businesses.

For everyone else, technology is there to help the business get things done, not to help developers line their CVs with cool-sounding TLAs.

The company I'm working for at the moment is quite large and has a CTO who chooses 'strategic platforms'. But I have to say, if you can name a technology, they're probably using it somewhere. They're too big to beat everyone down with the corporate stick, but they try. If the technology will work in the project and bring it in on time, then it gets used.

Nick Kavadias
+2  A: 

We stick to what we know for our major and key projects within the company.

For any new "mini" projects that come along, we take the hit on the learning curve to try and build them in the latest technologies if at all possible.

This enables us to get up to speed on these things to then comfortably and safely use these technologies in our major projects as we see fit.

Robin Day
A: 

We need solid and proven platforms for our stuff, and we don't need anything fancy. Therefore we might go for .NET after 5-10 years or so; hopefully it's ready by then. On the other hand, Java is already mature enough, so we're using it alongside C++ and some Jython scripting. These decisions are pretty much autonomous (we're a small shop).

I don't mean to mock bleeding edge developers, but whether you need solidity or newest features obviously depends on what you're working on. Many scientists are still happily using Fortran 77.

Joonas Pulakka
+1  A: 

I have a mixed answer to this question. Where I work, lower-level technical managers are usually the ones who choose a certain technology, and sometimes even the developers have the freedom to try something new. For example, I really wanted to learn about JavaScript's Prototype library while working on a web site. I made the case to my boss; he was reluctant at first because nobody else knew it or had used it before, but he gave me the go-ahead. It was great to be able to learn Prototype and take advantage of its many built-in features. Other, bigger projects come down from higher management and we don't really have much of a choice. Right now, my company is adopting SAP, so everything is moving in that direction. I don't necessarily want to become an SAP expert, but if I want to stay here, I'll need to at least learn how to work with it.

AlexFreitas
My condolences (about SAP, not Prototype).
David Thornley
+1  A: 

Every company has its own pace for innovation, and it's dependent first on the comfort level of the managers, and second on whether anybody actually does the work to research and propose using new things. When the managers start getting uncomfortable, innovation slows or stops until they get comfortable again. Some innovations they will never be comfortable with.

Keeping this in mind, I'm not sure how to answer your question about whether or not it's fair to expect more innovation than is happening. Certainly it's reasonable for you to want more; equally, once you've hit your organization's speed limit on innovation, it's not likely to change and, if it does change, it will probably take a long, long time.

I've been given rather large amounts of freedom to change things by various managers in my past, and I took advantage of it. I also ran into the limits on a regular basis, and finally dealt with my frustration by starting my own company. (This may be considered a somewhat drastic measure; certainly by doing so you reduce the time you have to research and develop the very things for which you started your company.)

These days I'm developing rather significant applications in Haskell, and I'm pleased as punch. After a year, I'm starting to get the hang of it, and I certainly have several more years ahead of me just learning what I can do with the tools I have now.

I suppose the summary of my response is: if you want to innovate more than those around you, you need to change your peer group.

Curt Sampson
Thanks for your insightful comment. Actually, the question 'is it fair to expect otherwise' was merely a side question. Things are starting to roll here, by the way; I'm doing the latest project using .NET 3.5, MVC, and Linq as the ORM, so I'm pretty happy :) I was mainly curious how other companies handle the constant innovation in the software world.
Razzie
Well, we handle it by a) developing our skills at learning, so that we can learn more quickly, and b) acquiring enough general knowledge that we can see whether the new tool/framework/whatever of the moment actually offers enough advantage to make it worth the time to learn and use.
Curt Sampson