I often hear people saying you shouldn't rush into adopting new technologies until they have become stable, tried, and tested. There is even a joke about how it takes three versions to get it right. This might be the voice of real-life experience, but at least sometimes such a posture is the result of complacency, resistance to change, and the effort needed to learn new skills.

In my opinion, however, it is crucial for success in the software industry to keep pace with innovation. While big companies have whole departments dedicated to R&D, in smaller companies it's the development teams that have to keep up. Embark on a new technology even before it is officially out - this gives you a head start and helps you keep up with the rest.

Here is the strategy that I try to follow whenever possible:

  • Be aggressive in adopting new technologies
  • Use early betas for experiments and prototypes, and RCs for development
  • Address any last-minute changes to your product when the official release of the technology you adopted early comes out
  • Do not rely on some obscure open source project with zero activity
  • Be sure to study the official product roadmap, but take it with a grain of salt.

So far, I have never paid the price for being too eager to jump on some new technology train, yet I have reaped the benefits. I wonder if this is just a coincidence, or maybe being an early adopter is not so dangerous after all?

Rather than inviting a discussion on the subject of early adoption, since such an issue is sure to be contentious and subjective, I would like to hear real-life experiences where adopting a new technology early proved to be a serious mistake and a dire price had to be paid.

+8  A: 

Yes I have! With JSF 1.0! It seemed like Sun didn't review it well before releasing it.

We kept trying to make things work, but after a while we discovered that our errors were caused by JSF bugs and we had to use workarounds. It was not until JSF 1.1 and the use of the myfaces-tomahawk implementation that the project started picking up speed.

victor hugo
Well, if you are talking about JSF, I have also had bad experiences. That's the power of open source: everyone is trying to implement the JSF specification, nobody can do it right, and they are all incompatible with each other.
User
Yes, I guess we could start a community wiki about bad JSF experiences
victor hugo
Curiously, I also evaluated JSF at one point early on but decided against it.
Danijel Arsenovski
+8  A: 

When I was 10, my father tried to play a New Year song for me on a brand new Elektronika BK-0010-01.

Needless to say, the synthesizer failed to load from the tape and there was no song until the neighbour came over with a guitar.

Quassnoi
Why, BK is a computer and you had to use BASIC to load the synthesizer :)
Quassnoi
when you were "10"? Does that mean 2 in base 10? :P
cwap
+3  A: 

In my opinion, however, it is crucial for success in the software industry to keep pace with innovation.

This doesn't answer your specific question, but, there's a book called Crossing the Chasm that might interest you.

ChrisW
Thanks Chris. I guess this puts me under “enthusiasts and visionaries”. It would be interesting to see how this applies to the software industry; probably the adoption curve is much shorter than in some other industries.
Danijel Arsenovski
+2  A: 

I could count many of them. The one that still hurts when I think about it is WLPI (an old BEA workflow product). It never worked out and the vendor abandoned it. Sigh ...

Anyway, I would say keeping up with the latest (knowing what is there, considering it) is very worthwhile, but only live on the cutting edge if:

  1. You are prepared to get cut and bleed (money/time/resources)
  2. It provides an important strategic advantage/competitiveness.

A good example for this is AJAX. It is now mature enough that every new website should be doing it unless they have a compelling reason not to, but when it was first becoming possible, a website built on it would have been very expensive compared to the traditional alternative.

Some websites need the latest look and feel to stay competitive, even to the point where the features of the site themselves are secondary, and they needed to be AJAX early adopters. Others do not. Know which one you are and act accordingly.

Yishai
+17  A: 

Several years ago, we made heavy use of the new SQL Server 2005 feature called Notification Services. To our dismay, it was discontinued in SQL Server 2008. This was a serious problem, and it caused the software architect to question all new Microsoft technologies.

Here's some detail and some more and some more

There have also been issues with Microsoft's Entity Framework.

DOK
"This was a serious problem, caused the software architect to question all new Microsoft technologies." We might start a community wiki on this too
victor hugo
Do not use Microsoft RAD technologies if you have a project that is supposed to be long-lived. They just push out those things every two years or so (Datasets, Typed datasets, Linq2Sql, EF, RIA Services, ...). I guess they need to, because they must always satisfy RAD (or drop-database-table-onto-form type development) in the best possible way with the current technology.
herzmeister der welten
I wholeheartedly agree with the "Don't Use MS RAD Technologies" statement. Hand-coding ADO, and the current ADO.NET, has always been my way of doing things. It still works, is still performant and robust, and is still IMHO better than any of the other ways which have been introduced and flamed out.
Chris Lively
+2  A: 

For me Delphi's IntraWeb was it.

Ricardo Acras
+7  A: 

QBASIC never really took off. I spent years learning it too.

OK, to be fair it was my first language and a good way to learn. And it was later replaced by Visual Basic, then VB.NET. So it wasn't a complete waste of my time. ;)

Most of the time even if a language doesn't "take off" exactly, it's still a good learning experience that can be applied to something else.

Steve Wortham
I never did figure out whether QBASIC and QuickBASIC were the same thing or not.
Kyralessa
AFAIK QBASIC was a stripped-down version of QuickBASIC released together with MS-DOS 5.0. I think QBASIC was compatible with QuickBASIC code, as well as GW-BASIC (supposedly Greg Whitten's own BASIC interpreter) and BASICA. Microsoft has a long love story with BASIC, definitely ;)
Adrian Kosmaczewski
Yes, QBASIC was compatible to a large extent with QuickBASIC. But QuickBASIC was compatible with GW-BASIC and BASICA (in that order). Remember that old BIOSes came with a full BASIC built in. Microsoft used QBASIC as a teaser to have you buy QuickBASIC. They made QB interpreted, with roughly 20x slower performance than QuickBASIC. First teaserware? I still have my QuickBASIC, QBASIC and even PowerBASIC (still exists!) books. Nice reminiscences ;-)
Abel
I don't think this counts. The existing technology (various basic versions) was in heavy use, and qbasic was a lite version of quickbasic which was used (to a lesser degree) the way vb was later used. I know a small IT support firm was using a qb app for time tracking in ~2004.
Draemon
QBASIC lives, in the browser, no less: http://stevehanov.ca/blog/index.php?id=92
Brian Campbell
+25  A: 

I can write a pretty good Java Applet. All technologies will fall by the wayside eventually, but this one had a very sharp rise and fall.

Bill the Lizard
I'd like to see it! :-)
Ken
But how did you get burned by using Java Applets? They might not be hip anymore, but they still work fine, AFAICT.
Peter Eisentraut
@Peter: I'm not exactly getting a deluge of job offers based on this skill.
Bill the Lizard
@Peter I generally leave Java and Flash turned off in my browser, unless I need to use them for something (for security reasons). Flash, I need to turn back on an average of once a day. Java, once every two or three months.
Brian Campbell
We were doing a WBT (Web Based Training) system for Sun back in 1997 (maybe even '96) and they *had* to have some Java in it. Why? Because ... it was *Java!* We were trying to simulate packets moving through a firewall. It should have been straightforward but it turned into a nightmare of wrong documentation, buggy JVM code, etc. I *hate* pretty much anything before 2.1. Django has been the one shining exception to that.
Peter Rowell
+3  A: 

I was once forced to use witango, but I'm getting over it.

Galwegian
Care to explain why you did not like it? I'd be interested, as I've never heard of it.
sleske
+2  A: 

Blackbird.

A wonderful development environment for creating interactive content for MSN.

chris
Wasn't that sort of the precursor to .Net/Silverlight etc
TFD
No, it was mostly for creating proprietary content for the MSN service.
chris
+6  A: 

Delphi.NET. Still have a tic when I hear that!

Conrad
I went through 3 versions of it before finally giving up. ECO was especially caustic. Never again. Although I hear that the newer non-.NET flavour has been vastly improved.
Aaronaught
+5  A: 

Hell yes

I'm currently feeling the pain of being an early adopter of Fortran 2003 :-)

Mark

High Performance Mark
+4  A: 

I'm incredibly close to the flame every day by being an early MonoTouch adopter. I never know what's going to happen next with this framework. But to its credit, the Novell team is standing by with fire extinguishers just about 24/7 :)

Matt Greer
+13  A: 

Anybody else remember OpenDoc, Apple's idea for how all new Mac applications would be written? Didn't think so.

David Thornley
+1 not for OpenDoc, but for Cyberdog!
Seth
@Seth Yep, I got burned by Cyberdog. Had all my email stored in there, in a proprietary binary file format. When they cancelled it, there was no way to get my email out...
Brian Campbell
Cyberdog is the compelling new internet access technology.
Coxy
+1  A: 

This addresses your discussion more than your question. I think you are assuming that the cost benefit of adopting new technologies is a given. For a very large corporation, changing technologies can cost hundreds of millions of dollars. If the cost benefit is not there, then those hundreds of millions can be saved. Most companies use technology to make something else and cannot afford to consume new technology simply because it exists. When the cost benefit is there, then it makes sense to do so.

fupsduck
A: 

PHP5.

I personally think it was a bad move by the language developers not to quickly drop PHP4 support. I had a hard time convincing server admins to switch to the new version because it was "immature".

I still have a website running on good ol' PHP4 for this reason.

Hilton Perantunes
I don't think immaturity was the main reason for providers and admins not to switch; it was the fear of enraged customers when their PHP4-based web site/CMS/shop system wouldn't run any more.
Pekka
@Pekka: You're right. But most modern open source PHP apps adopted version 5 after a few years. By not dropping version 4 development, the PHP language maintainers encouraged system admins to think, "why should I bother to set up an additional PHP5 environment if we still have the active, supported and most excellent PHP4 at hand?"...
Hilton Perantunes
There's such a thing as "good enough". PHP4 reached that point; thereafter, major changes always have trouble gaining a majority.
Matt Joiner
+9  A: 

Scala.

It looks great on paper, so I wrote a project with it while making sure to keep my Scala version up-to-date. The version number (2.7.x) and its years in development made me feel relatively secure doing that.

Well, I made a mistake. The problem? A serious lack of documentation and code samples, as well as an ever-changing class library (twice during my work, previously working code started getting "deprecated" warnings... and I'm talking over the span of a few months and similar version numbers).

I can't say I lost much (this was a private project) but I will not touch Scala in the near future. I still think it's a very nice, promising language, though.

Oak
The tools are still awful, too.
skaffman
+5  A: 

Mozilla XULRunner.

It was Adobe AIR before there was AIR. We wrote our Human Resources Management system using it. At the time XULRunner was "just about" to be released as the underlying engine for Firefox, so we expected that all we would have to do is make sure our users had Firefox installed.

Two years into the project, and right before deployment, a new XULRunner came out that completely broke all of our code, and a Firefox deployment was nowhere in sight. We ended up deploying on our older version with a dedicated desktop installer and have been using it ever since, without the benefit of security or performance updates, because we would have to rewrite too much code to be compatible. Despite that, it has been a very successful project with our customers.

We're now rewriting the app to run on Ext, which is the new hot thing for us but seems to have more community support, and offers commercial support if we really get stuck on something.

Mark Porter
Would you recommend XULRunner today for developing new desktop applications? (I ask because I'm considering to use it for a professional project.)
StackedCrooked
I've been burned so I'm pretty biased. Having said that, I would use Adobe AIR for the kind of things XULRunner is good for. It runs reliably on Windows, Mac, and Linux, has a good support commitment from Adobe, and is free to develop for and distribute.
Mark Porter
+18  A: 

I'm currently in the process of getting burned by Microsoft Office Word 2007's CustomXML support.

CustomXML allows the document to contain custom-defined elements that can model business data, etc. For example, you could define an XSD with your custom elements, associate it with a docx file, then generate the placeholders as CustomXML tags and navigate/modify the documents using C# (or other .NET languages) and the OpenXML SDK. The benefit of OpenXML is that it removes the need to have Office installed on a server machine for automation purposes and is an alternative to purchasing 3rd-party libraries.
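
For anyone who hasn't seen the mechanics, here is a minimal sketch of reading those CustomXML wrappers with the OpenXML SDK. It assumes the DocumentFormat.OpenXml package; the file name and element contents are purely illustrative:

    // Sketch: enumerate CustomXML wrappers in a .docx with the OpenXML SDK.
    using System;
    using DocumentFormat.OpenXml.Packaging;
    using DocumentFormat.OpenXml.Wordprocessing;

    class CustomXmlDump
    {
        static void Main()
        {
            // Open the document read-only; "report.docx" is just an example path.
            using (var doc = WordprocessingDocument.Open("report.docx", false))
            {
                var body = doc.MainDocumentPart.Document.Body;

                // CustomXmlElement is the base class for block- and run-level <w:customXml> wrappers.
                foreach (CustomXmlElement cx in body.Descendants<CustomXmlElement>())
                {
                    // Element is the custom tag name, Uri the namespace it was declared in.
                    Console.WriteLine("{0} ({1}): {2}", cx.Element, cx.Uri, cx.InnerText);
                }
            }
        }
    }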

In short, there was a lawsuit regarding Word 2007's ability to open documents with custom-defined XML. From this article:

On August 11th, the company received an Office Word sales injunction ...

"This injunction applies only to copies of Microsoft Word 2007 and Microsoft Office 2007 sold in the U.S. on or after the injunction date of January 11, 2010. Copies of these products sold before this date are not affected."

Microsoft's response is to remove support for CustomXML from future versions of Word, and it is releasing a patch that entirely removes this capability. Here is the link to the official update. According to this Microsoft OEM Partner Center site:

The following patch is required for the United States. The patch will work with all Office 2007 languages.

After this patch is installed, Word will no longer read the Custom XML elements contained within DOCX, DOCM, or XML files. These files will continue to open, but any Custom XML elements will be removed. The ability to handle custom XML markup is typically used in association with automated server based processing of Word documents. Custom XML is not typically used by most end users of Word.

I imagine a tiny percentage of end users and developers make use of it, so I consider that last sentence to be accurate. The problem is there's currently no word (no pun intended) on how to move forward for projects that did utilize this technology. CustomXML is the cornerstone of a large project I'm currently working on. The impact of this decision is not positive and it effectively prevents any forward compatibility as there's no equivalent alternative approach that maintains the structure that CustomXML provided.

Some of my coworkers and I have a wealth of knowledge on the topic... I guess it's good we didn't get around to writing blog posts about it as we had planned :) We've accomplished some pretty impressive feats with this and the VSTO, but this news is disappointing.

If anyone's interested in this topic, there are ZDNet, BNet, and Softpedia articles worth checking out.

EDIT: added link to the official update.

Ahmad Mageed
Go read the patent in question: http://i.zdnet.com/blogs/msfti4icomplaint.pdf (see exhibit A). AFAICT, it's nothing more than an algorithm for splitting apart the tags and content of a markup file (e.g., an XML file). The tags are stored in a lookup table along with their original location, so that they can be recombined with the content to produce the original document. It seems like much too simple an algorithm to be patentable - I have no idea how Microsoft could have lost the case.
Cybis
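
For illustration only, here is a toy sketch of the kind of scheme described in the comment above: pull the tags out of a markup string into a table of (position, tag) entries, then splice them back in to reproduce the original. It is a hypothetical reconstruction, not the patented implementation or anything from Word:

    // Toy sketch: separate tags from content, remember their positions, recombine later.
    using System;
    using System.Collections.Generic;
    using System.Text;
    using System.Text.RegularExpressions;

    class TagSplitter
    {
        // Split markup into plain content plus a list of (offset-in-content, tag) pairs.
        static (string content, List<(int offset, string tag)> tags) Split(string markup)
        {
            var tags = new List<(int, string)>();
            var content = new StringBuilder();
            int last = 0;
            foreach (Match m in Regex.Matches(markup, "<[^>]+>"))
            {
                content.Append(markup, last, m.Index - last);
                tags.Add((content.Length, m.Value));   // remember where the tag belongs
                last = m.Index + m.Length;
            }
            content.Append(markup, last, markup.Length - last);
            return (content.ToString(), tags);
        }

        // Recombine by inserting tags at their recorded offsets, from the end so earlier offsets stay valid.
        static string Recombine(string content, List<(int offset, string tag)> tags)
        {
            var sb = new StringBuilder(content);
            for (int i = tags.Count - 1; i >= 0; i--)
                sb.Insert(tags[i].offset, tags[i].tag);
            return sb.ToString();
        }

        static void Main()
        {
            var (text, tags) = Split("<doc><name>Ada</name></doc>");
            Console.WriteLine(text);                    // prints: Ada
            Console.WriteLine(Recombine(text, tags));   // prints the original markup
        }
    }
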
@Cybis thanks for the link. I had seen their diagrams elsewhere but missed the actual documentation. The patent process is somewhat ridiculous. Props to them for coming up with it and all, but sometimes what gets patented is just a hindrance.
Ahmad Mageed
@Cybis: A bit like FAT then?
Draemon
I really really hate the US Patent process. It's a joke and has caused no end of grief for companies of every size. At this point I fully believe no software should be patentable as there are many ways to differentiate yourself in the marketplace and to take over if possible. Imagine what would have happened if someone patented the "ability to retrieve documents by searching for patterns based on input from a user". We'd still be in the dark ages.
Chris Lively
BTW, MS lost because they spoke to the company (and signed NDAs) about partnering prior to developing their own solution. From all appearances it looks like they brought the company in simply to figure out how to do it themselves. The patent was only one part (and probably a small one) of the reason why they lost.
Chris Lively
+5  A: 

Java

I was very eager to start working with it in 1996 and used it for several projects. But for web development I always preferred Perl, and these days PHP. For GUI development I ended up mostly using .NET. For the few command-line programs that cannot be handled by scripting, I prefer to use Perl, Python, or even PHP.

Few of the Java programs I wrote were used over long periods of time, while some of my pre-Java applications are still in use.

I think the main reason for this is that it always took longer to develop something in Java than in another programming language, so the resulting applications contained fewer features and were easier to replace.

As speed of development is usually an issue for my customers Java tends to end up as the second choice.

Matijs
Wouldn't this have been a problem even if you'd tried Java at a later date, when it was mature? It just sounds like a bad match for your requirements, not an early-adoption problem.
Jeff Sternal
Nope. In those days Java often was the best choice - those other languages didn't exist. But other languages that learned from Java overtook it - at least for my line of work.
Matijs
I went down the java path about the same time as you. When MS and Netscape were battling over implementation. It was a total nightmare. Each week saw updates from both browsers which sometimes improved support and sometimes broke things that were working before. After 3 months of trying to get a project going I finally chucked java out the window and wrote the app I was working on in ColdFusion.. The CF version took 2 weeks to complete and was the only way for us to meet our deadline. Incidentally, the final straw was when sun decided to rewrite the db access layer.
Chris Lively
A: 

Yes. Some other programmers from the team "tested libraries": "It will do exactly what you need!" Then I start using one, and it turns out that the library itself is crashing rather than my code. That happens pretty much once a month or so.

Our team is just getting started with TDD, so maybe it's normal, but it burns a lot anyway :)

Lukas Šalkauskas
A: 

I was really interested in Model Driven Development, so I once used the Leonardi software, which was free to download, for an academic project.

After working with this software for a while, I noticed the lack of documentation and the complete absence of support (at least nothing free).

I had to finish my app on Leonardi, and the result was very poor work!

So, don't use a new technology unless you're sure you'll have enough quality documentation and support - that you can afford!

iChaib
+2  A: 

True Basic

In the mid-1980s we were looking for a development platform that would work on the various DOS implementations and not be as "bit-twiddling" a language as C was.

We found True Basic, advertised as having been created by the original creators of BASIC back in 1964. Here was a language that 'compiled' down to p-code. Not only would it run on DOS machines, it ran on GEM (Atari-ST) and Amiga boxes.

It had add-ons much like we were used to having with development environments on the VAX/VMS machines we used. Things like Forms packages, an "ISAM" add-on (before the days of callable databases on PCs), etc.

Unfortunately, the multi-platform abilities never sold the language enough. Heck, according to Wikipedia, there's a Mac OS version (though not OS X or Snow Leopard). I even found the 'current' TrueBasic page while writing this note.

Eventually Visual Basic 1.0 came out and all the BASIC programmers, like myself, checked it out since it had Microsoft's name on it. Now, of course, 10 versions later, we've been steered over to the .NET platform while TrueBasic sits at V5.5.

David
+4  A: 

AzMan (Microsoft Authorization Manager)

We started using this on a public web site/web app, enticed by dreams of single-sign-on and claims of being able to "leverage your existing infrastructure" or whatever the marketing speak now says. A drop-in solution for ASP.NET that sysadmins could manage without having to develop any tools or write any code at all. It was win-win, right?

We learned several things as a result of our decision, none of which we wanted to learn:

  • Active Directory itself is not a very good choice for an authentication mechanism servicing a public web site. Not that it isn't capable - it's quite capable, but it's like hiring a Ph.D to write a "Hello World" app. It's overqualified, it does way more than you could ever need in such a context, it's much more difficult to work with than a plain old SQL table, and requires a lot more maintenance.

  • AzMan is slow. Very, very slow. The role provider has to maintain a cache, which should tell you just what kind of performance we're talking about. I never did fully understand why it was so slow, but I imagine it has something to do with the hornet's nest of COM and network protocols it depends on.

  • A cache (see above) can be a very dangerous thing when you have little to no control over it. When we added new users manually (i.e. through an administrative application as opposed to the site itself), those users would end up with a "not authorized" screen until the cache expired and they logged out. Sometimes this would even happen to users who self-registered online; we never did find out why.

  • The tools were horrible. Take a brief look at the AzMan console if you don't believe me, or read some of the documentation if you really want a headache. Why should anything be so complicated?

  • It was flaky. A lot of times the provider would just stop working, spitting out cryptic COM errors (a different one each time!) and we had to restart IIS or even the entire web server to get it to cooperate again. We also had a domain trust set up - because obviously we didn't want 50,000 public user accounts on our internal corporate domain - only problem was, administrators had to log in to administrative accounts on the secondary domain to manage roles because the console would fail in mysterious ways if you tried to use it from the primary (even as an Enterprise Admin with Domain Admin rights on the secondary domain).

  • Support was practically nonexistent. If you use the basic SQL Server role provider (which we don't, but just as an example), there are 10 million tutorials and you can Google for any error message or ask any question on any forum. Whenever something went wrong with AzMan or we wanted to do something new, it was a constant struggle to find good information.

  • Code integration was awkward. You had to go through a bunch of messy COM layers and the interface sucked. If I recall correctly, there was no way to just do a simple authorization check - you had to download the entire app/role registry. This was a long time ago though, so my memory might be foggy on that aspect.

Eventually we couldn't take it any longer and decided to rip out the entire system for a homegrown one based on a couple of SQL Server tables, which is probably what we should have done from the get-go. Migration was painful (see the two points above), but we got it done, and never looked back.

Aaronaught
+1  A: 

The TurboGears web framework

I had a web app to write and jumped onto this (having heard about it from a friend). I wasn't really aware of the alternatives, didn't know MVC properly and wasn't aware of the alternatives to the various 'standard' components (eg. SQLAlchemy instead of SQLObject). While the documentation and general state of the project is far better than it was when I got my hands dirty, I ended up with a huge application that relied on 'tricks' to bypass some of the magic features and had lots of undocumented features in it to meet the deadlines. It became a maintenance nightmare and I really wish I had taken the time to build something simpler with plans for a rewrite if the requirements changed.

This was the 1.x series, which has now been deprecated in favour of the Pylons-based 2.x series. As you can imagine, the core team itself decided on a re-architecture, but I was stuck with a legacy application which I had to maintain.

Noufal Ibrahim
+4  A: 

For lacking market presence:

  • Python3000
  • Mercurial
    • lack of support, it's quickly catching on, but not as ubiquitous as Subversion
    • conversion and interoperability tools are still severely lacking to make it a viable drop-in improvement over Subversion
  • C++0x
  • C99

For poor quality:

  • Windows Vista

For lagging behind upstream:

  • PyGTK on Windows

Note that my listing these technologies in no way suggests that they're no good; I'm a huge fan of all of these (except the poor-quality ones). My opinion on being burned by these technologies is first-hand (usually from trying to push them as replacements for existing technology, or simply running into barriers after a significant investment had already been made).

Matt Joiner
How exactly did you burn your hands with Mercurial?
jetxee
Did you actually get burned by any of these? Or are these things that you consider to have the potential to burn someone?
Brian Campbell
i hope my changes answer your questions
Matt Joiner
+1  A: 

Windows Open Services Architecture (WOSA): http://en.wikipedia.org/wiki/Windows_Open_Services_Architecture

The foundation for ODBC, MAPI, TAPI, etc.

Erwin
+3  A: 

Not programming but still a newer technology blunder - I nearly lost a nipple to my first mini-ATX build, moral of that story is to never lean over a case while trying to forcefully close it when it gets jammed...

mynameiscoffey
ha! i'm about to do a mini-itx build in the next few weeks. a great story from the trenches
Matt Joiner
I've never taken something apart so fast in my entire life, gives an all new meaning to "bleeding edge" - be careful!!
mynameiscoffey
+5  A: 

Unfortunately it cuts both ways. When we first started developing a large web-based app on Windows, .NET had come out in beta - with a final release of .NET 1.0 not long away.

However, because it was new, we didn't know what was going to happen, how popular it would be, or whether MS would drop it six months later. So we stuck with the tried-and-tested VB6.

We're still having to maintain that VB6 legacy, and it's been restrictive for a while. Although it's not listed anywhere, we're getting paranoid that support for the VB runtime is going to be withdrawn at a given version of Windows.

That said, going the .NET route may have had its own pain: 1.0, 1.1 and 2.0 came out fairly quickly after each other, each with (some) incompatibilities with the previous version. Thus having to migrate across .NET platform versions would have carried a different risk. Less or more? Can't answer that one, having not experienced it :-)

In the end, you can be damned if you do and damned if you don't. If someone can read the entrails to determine whether a given technology is going to succeed at any one time, then they shouldn't have a job in Software, and should probably go into hedge fund management instead, make loads of cash and retire early :-)

Chris J
I stayed away from .net 1.0. After reviewing how you had to access db results by ordinal position, instead of by named column, I knew they still had a ways to go in order to make it work.
Chris Lively
A: 

Web push, J2EE CMP

admin
A: 

iBatis.

We ended up implementing a lot of the things Hibernate has already done. In essence, we developed our own non-standard internal JPA provider using iBatis. The technology had already been picked, so there wasn't much we could do.

In order to lessen the blow, we have refactored code to avoid duplication of effort.

Walter

+6  A: 

The worst is when you get 80% through a project using a new product and hit a showstopper bug.

Back in the mid-80s my boss suggested I try a new dBase alternative called KnowledgeMan. It was far along when I realized that some crucial bugs I thought were mine were actually theirs. The whole thing had to be redone from scratch; it cost me my job.

egrunin
+4  A: 

Yes. I'm a Lisp programmer: everything looks new and immature to me. :-)

Ken
A: 

MongoDB. I used it at the core of a new product and then found out that it doesn't fsync to disk with each write. So it has a chance of corrupting data if, for example, the server loses power.
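
As a point of reference, newer drivers let you ask the server to acknowledge a write only after it has been journaled to disk. A minimal sketch with the current MongoDB .NET driver follows; the connection string and collection name are illustrative, and these options may not have existed in the version discussed above:

    // Sketch: request journaled acknowledgment per write with the MongoDB .NET driver.
    using MongoDB.Bson;
    using MongoDB.Driver;

    class DurableWrite
    {
        static void Main()
        {
            var client = new MongoClient("mongodb://localhost:27017");   // illustrative URI
            var db = client.GetDatabase("demo");

            // Don't consider the insert acknowledged until it has been written to the on-disk journal.
            var events = db.GetCollection<BsonDocument>("events")
                           .WithWriteConcern(new WriteConcern(journal: true));

            events.InsertOne(new BsonDocument("message", "hello"));
        }
    }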

Clint Miller
+4  A: 

64 bit Carbon APIs on Mac OS X: I didn't get burnt personally on this, but I have a friend working for a big software company that spent a year converting almost all of their code to use the 64 bit Carbon APIs only to find out at WWDC that those APIs were no longer going to be made available.

Lyndsey Ferguson
+2  A: 

Windows Azure's .NET Services... They canceled it.

Shawn Mclean
A: 

What's even worse is when you have your company adopt new/immature software that you yourself developed. At first, when I was the only developer, my nifty HTML GUI framework worked just fine for building our RIA. However, once we got additional developers, I saw just how much lower developer productivity is when you're not using a proven platform with a strong community behind it. Due to this, and the fact that I eventually realized how unsuitable HTML 4 is for RIAs, we converted over to Adobe Flex 3. I'm very happy with the move.

Jacob
+2  A: 

VBA - We spent a lot of time integrating it into our product. We still spend a lot of time on each new release making sure that we don't break anything. VB6 and VBA are also COM-based, which is a problem if you want to run as a standard user and not have write access to the registry.

Arve
A: 

I was burned by:

  • gcc before it got over its hump, and MSVC++ after that
  • proprietary sun c++ compilers
  • Microsoft Developer Network
reechard
+2  A: 

Has anyone noticed the trend here? The majority of technologies mentioned here were created and then canceled or modified by Microsoft...

I have also been burned by Microsoft with changes made to the Entity Framework.

DVark
A: 

C++, but it was not me who was the adopter: Back in the late 80's/early 1990's, when C++ developers were few & far between, I got a job doing the front end on a project. The manager in charge of the project brought in a buddy of hers as a contract programmer to do the back end. At the time, I knew nothing about C++ or OOP in general. Even though the UI package we used couldn't do C++, he insisted on using it for his part. Later, when both the programmer and the manager were long gone, some bugs were found; I fixed those in my code easily enough, but couldn't do much but shrug my shoulders at the bugs in the other guy's C++ code. Instead of training me, the management in place at that time showed me the door.

GreenMatt