views: 87
answers: 2
This is a vague, open-ended question, so if you have no interest in that sort of thing, please leave now.

A few years ago it seemed everyone thought the death of desktop software was imminent. Web applications were the future. Everyone would move to cloud-based software-as-a-service systems, and developing applications for specific end-user platforms like Windows would soon become something of a ghetto. Joel's "How Microsoft Lost the API War" was but one of many such pieces sounding the death knell for this way of software development.

Flash-forward to 2010, and the hype is all around mobile devices, particularly the iPhone. Software-as-a-Service vendors--even small ones such as YCombinator startups--go out of their way to build custom applications for the iPhone and other smartphones; applications that can be quite sophisticated, that run only on specific hardware and software architectures, and that are thus inherently incompatible with other platforms.

Now some of you are probably thinking, "Well, only the decline of desktop software was predicted; mobile devices aren't desktops." But those predicting its demise used "desktop" to cover laptops too, and really any platform capable of running a browser. What was promised was a world where HTML and related standards would supplant native applications and their inherent difficulties. We would all code to the browser, not the OS. But here we are in 2010 with the App Store bulging and development for the iPad just revving up. A few days ago, I saw someone on Hacker News claim that the future of computing was entirely in small, portable devices. Apparently the future is underpowered, requires dexterous thumbs, and induces near-sightedness.

How do those who so vehemently asserted one thing now assert the opposite with equal vehemence, without making even the slightest admission of error? And further, how are we as developers supposed to sift through all of this? I bought into the whole web-standards utopianism that was in vogue back in '06-'07 and now feel like it was a mistake. Is there some formula one can apply rather than a mere appeal to experience?

+3  A: 

Eh - this isn't anything specific to computer programming. It's just the nature of our world; we don't know what the future will hold, but it's in our best interest to attempt to plan for it.

When people are wrong, few will admit it. And fewer will go out of their way to remind people of their previous claims. Some people will even use their previous failures as justification for how CERTAIN they are this time.

As a developer, or as a stock investor, or as anyone with a vested interest, I think it comes down to risk vs. reward. If you are convinced that X is going to be the next great thing and you are willing to assume a lot of risk, you will go out and develop, design, and build for it, and you'll do it on your own dime. If you are right, you'll stand to make one hell of a reward. On the flip side, if you are wrong, you'll be screwed.

If you don't want to deal with the risk, or don't feel like you've got more insight than the rest of the population, then you wait. You do the status quo, and whenever the next big thing is clearly here, you can adopt it. I'm a pretty conservative guy, so I try to split my time between web apps and desktop apps in .Net and Java. On the downside, when someone looks at my resume they see a guy who has done a few different things, not someone who is an expert at the one thing they want. I just see it as risk vs. reward; same as investing in stock.

Maybe I'm misunderstanding your question, so I apologize if I've missed the point.

Rob P.
+1  A: 

The sea of hype is just that - a sea of hype.
Any human being with a pulse has at least one agenda, and those agendas are what drive the hype, especially when they converge.

The purveyor of technology X wants you to believe in technology X and bet your resources on it, because that strengthens the market for technology X. There's nothing really evil about that; it's just commerce, which is largely about belief.
The more people believe,..., the more people believe.

At one stage, Java was going to enable networked toasters. Thankfully that did not happen, and I do not have to go to my mother's house just to reboot the toaster.

A crystal ball would be great, and perhaps iBalls will be the next great thing.
Currently, there are no crystal balls, until Steve says so.

I don't think time spent on web standards was wasted, but it was never going to be the "one true way". As soon as there is a "one true way", something else will try to subvert it in order to gain an upper hand commercially. "One true way" is at best ephemeral, more often naively idealistic and just-not-so.

It would seem that devices like the iPad are the new ground for content consumers, for those who aren't really that into computers per se and just want to surf the web and tweet about what they had for breakfast. Desktop workstations may well be relegated to working roles only, slowly disappearing from non-techie households and being replaced by netbooks, iPads, Xboxes, smart televisions...

I won't proclaim that to be a universal truth, just my current guess, and I reserve the right to change my mind - don't shoot me if I do.

Don't believe anyone that claims they know for sure.

As Rob said, it's a lot like investing in the stock market: you have to make up your own mind and take risks accordingly. Some you win, some you lose.

seanb