I'm at the beginning of my career and there are lots of things which are being touted as "The Next Big Thing". For example:

  • Dependency Injection (Spring, etc)
  • MVC (Struts, ASP.NET MVC)
  • ORMs (Linq To SQL, Hibernate)
  • Agile Software Development

These things have probably been around for some time, but I've only just started out. And don't get me wrong, I think these things are great!

So, what was "The Next Big Thing" when you were starting out? When was it? Were people sceptical of it at first? Why? Did you think it would catch on? Did it pan out and become widely accepted/used? If not, why not?


EDIT

It's been nearly a week since I first posted this question and I can safely say that I did not expect such explosive interest. I asked the question so that I could gain a perspective on what kinds of innovations in programming people thought were most important when they were starting out. At the time of writing this I have read ~95% of all answers.

To answer a few questions, the "Next Big Things" I listed are ones that I am currently really excited about and that I had not really been exposed to until I started working. I'm hoping to implement some or all of these in the near future at my current workplace. To many people they are probably old news.

Regarding the "is this a real question" debate, I can see that it obviously hasn't been settled yet. I feel bad whenever I read a comment saying that these kinds of questions take away from the real meaning of SO. I'm not wholly convinced that it doesn't. On the other hand, I have seen a lot of comments saying what a great question it is.

Anyway, I have chosen "The Internet!" as my answer to this question. I don't think (in my very humble opinion, and, it seems, in many SOers' opinions) that many things related to programming can compare. Nowadays every business and their dog has a website which can do anything from simply supplying information to purchasing goods halfway around the world to updating your blog. And of course, all these businesses need people like us.

Thanks to everyone for all the great answers!

+44  A: 

I don't really remember. I was too busy programming.

Noon Silk
I don't remember either, but it's more because I just wasn't paying attention.
Michael Myers
@mmyers: Good luck with that. ;)
Jim G.
+44  A: 

has to be Java, circa 2000.

aaa
yep, that was me. No more dealing with memory. Just write and go!
Alastair Pitts
Java was about 4-5 years earlier than that.
Tim
@tim: Java didn't really hit until 1998. Heck, in 1995, it was still called Oak, so 5 years before 2000 is a bit early on that one. :-)
Dean J
@Alastair - I still have Gosling's paper for the first version. I tried it, looked at the TCL/TK quality of the gui and went back to Motif!
Martin Beckett
I remember creating my first classes in Java 1.1.7 and having to wait for hours to download the Java 1.2 SDK
OscarRyz
Hmm, I distinctly remember Java in 1996 and according to this: http://www.java.com/en/javahistory/timeline.jsp the name was Java in 1995. An ex-co-worker published this book http://www.amazon.com/Developing-Java-Beans-Robert-Englander/dp/1565922891 in 1997.
Tim
@tim - Agreed: 1995. Near the end of that year I remember academically using it. And then in 1996 making applets.
John K
Guys, I think "circa 2000" means when he started, not when Java started.
Matt Olenik
@Matt, He may have started in 2000, but Java was the next big thing late 1995/1996. By 2000 the "cross-platform / applet hype" had really died down.
Ash
@Dean J, @tim is absolutely correct, you seem to be confusing your dates. Java was taught in my CS course late 1995 into 1996. I distinctly remember at the time how it was going to save the world through "write once, run anywhere" and saving us from C/C++.
Ash
@frungash, @tim, @ash: so, by the timeline that frungash posted, it was still called Oak at the beginning of 1995. I don't really feel wrong on saying it was still called Oak in '95. It didn't really hit - in my experience - until the late 90's, and then hype slowly died back after 2000.
Dean J
hello good people. I first heard about Java around 98 (applets). Around 2000 I went to university right about the time Java became teaching language instead of Pascal. So for me, and many other people, biggest Java hype was late nineties, early 2000s
aaa
I went to university in the early 2000s, and remember a freshman year course where Java was touted as basically the answer to everyone's prayers.
MAK
+28  A: 

It had already been out for a long time, but C++ was still the next big thing, and there were endless debates about C++ being slower than C because it more easily enabled things like object oriented design.

Then later when I hit University the next big thing was Java. And there were endless debates about Java being slower than C++. Apparently everyone's toaster would be running it. Still waiting for that.

Brian R. Bondy
@Brian: I liked the "Java toasters" bit in your previous edit. In fact I went on google and found you two: http://www.theregister.co.uk/2001/06/04/bread_as_a_display_device/ and http://www.embeddedarm.com/software/arm-netbsd-toaster.php
Daniel Vassallo
@Daniel Vassallo: I reverted back to that answer. Haha I guess you showed me wrong.
Brian R. Bondy
@Brian R. Bondy: Well... At least it's impossible to buy a Blu-Ray player without Java, for it's part of the Blu-Ray specs; Java is mandatory. In addition to that you have entire countries (like Belgium) where every person is carrying a Java national ID smartcard [TM] in his wallet, and other countries like Brazil where the medical care system and medical care personal cards are Java smartcards too :) Every single cellphone besides Brew and Apple ones is Java too, etc. Java is the biggest language success story of these last 20 years by a fair margin... A toaster is coming, I'm sure ;)
Webinator
+1: Had a laugh. I hear debates about C# being slower than C++/C now. I didn't start programming until about 4-5 years ago myself, so I wasn't around for the "new" C++ era.
Zack
What's interesting is that the current "next big thing" of the last couple of years is Ruby-on-Rails and it is still plagued with endless debates about how much slower it is than Java.
Taryn East
+141  A: 

The next big thing when I was starting out was the Internet.

  • When was it? :: circa 1995.

  • Were people sceptical of it at first? :: not really (dot-com bubble).

  • Did you think it would catch on? :: yes but not with that explosive growth.

  • Did it pan out and become widely accepted/used? :: the Internet might have become the cornerstone for many "next big things" that followed.

Daniel Vassallo
I was starting programming as the bubble was bursting.. and we weren't told about it, just told to ignore 'all that silicon valley kuffluffle. Here, have some more html constructs'.
glasnt
You young'uns! :)
Plynx
Youngun, no doubt, back in my day, I had to crawl up the mountain to use an abacus. You kids with fancy-schmancy punch cards.
Milan Ramaiya
Yah, I started up in '90 when the Interwebs were finally out of DARPA and released full scale into the university system.
Joel Etherton
I think that there internet fad may just catch on.
Austin Fitzpatrick
punch cards were a blast -- like a dot matrix you wanted to 'upgrade' with a sledge-hammer
Hardryv
I still think Al Gore invented the Internet.
John K
When I started out, there was a huge hype/debate regarding client/server architecture, that was 1993. Today when webapps and ajax are the foundations on which most new stuff is built, imagine how it was back then when even TCP/IP was obscure...
Ernelli
Six months after I started out, I had to write my first management memo justifying the purchase of a $900 30MB hard drive...that's right, I said 'MEGAbyte'. Today, I'm wearing an 8GB flash drive on a lanyard around my neck...
Neil T.
You kids get off my Internet!
S.Lott
"Were people sceptical of it at first? :: not really" - i don't agree. MS was very sceptical at first. They even tried to create their own network (which later evolved into MSN).
el.pescado
You started out in 1995 and the internet was the next big thing!? Are you kidding me? Though the internet boom didn't happen til a few years later, I was in elementary school and had been using the internet since 1990 - it was there and already a huge deal, but it wasn't til HTML2.0, new IP standards (post 1996), and additions to the net backbone that it was the next big thing. I think in 1995 Video CDs, CD rippers, torrents, but more importantly mp3 was more of the next big thing.
vol7ron
+58  A: 

32-bit address space.

dtb
Finally, no more segments! God I hated those segments...
LiraNuna
... Who didn't? I mean, seriously, pointer comparison considered harmful?! `near` and `far` pointers? Thank `${DEITY}` that brain-death didn't make it into POSIX.
Mike DeSimone
*sigh*... if only pages weren't such a popular way to expand available memory on aging DSP and embedded processor families. I guess it's better than simply saying "Sorry, no more memory for you!"... but not much.
darron
@LiraNuna, your profile says you're 21. I haven't done any segment/offset addressing since VGA programming in about 1995. You would have been 6???
Ash
@Ash: Long story...
LiraNuna
@LiraNuna Really, no more segments? How then do we determine which IDT entry to look at, and what privilege level the current code is running with?
asveikau
I am 21 and I actually do remember the entry of 32-bit, or at least reading about it. I caught the end of it reading a VB5 programming book explaining how wonderful it was. Admittedly, it was 1999/2000 when I read said book. I was using it to learn VB6 to write silly programs when I was 10. Yes, I could do (very) rudimentary programming before I went to secondary school (UK). My primary school had Acorn computers.
Ninefingers
Although let's be honest I had no idea what it meant, just that my book rabbited on about it.
Ninefingers
@Ninefingers: Be very glad that you didn't know. It was utterly horrible unless you wrote programs where everything was in the same single segment (which limited you to 64kB total).
Donal Fellows
Would that be .model flat in nasm-speak? I think I am glad I didn't know about it.
Ninefingers
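
For anyone who never wrestled with real-mode x86, here is a minimal sketch of the segment:offset scheme the commenters above are groaning about. It assumes a 16-bit DOS compiler of the Borland/Turbo C family, which supplied the non-standard far keyword and the MK_FP macro; a linear address is segment * 16 + offset, so many different pairs alias the same byte.

/* Sketch only: 16-bit real-mode segment:offset addressing, assuming a
   Borland/Turbo C style DOS compiler (non-standard `far`, MK_FP in <dos.h>). */
#include <dos.h>

int main(void)
{
    /* Linear address = segment * 16 + offset, so these two far pointers
       alias the same physical byte (0xB8000, the text-mode video buffer). */
    unsigned char far *p = MK_FP(0xB800, 0x0000);
    unsigned char far *q = MK_FP(0xB7FF, 0x0010);

    *p = 'A';  /* writes the character cell in the top-left corner */

    /* With far pointers, == typically compares only the offset halves,
       so p and q look unequal despite aliasing -- one reason pointer
       comparison was "considered harmful" back then. */
    return (p == q);  /* usually 0 */
}
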
+49  A: 

OO - there was this C++ stuff coming along, of course you would still need C for 'real' work.
And there were lots of new machines/OSs that were going to finally replace Unix !

Martin Beckett
+1: The internet existed, but was restricted largely to academic sites ("Usenet is not the internet", www not even a dream yet). Office LANs were not de rigueur. And a 40MB HDD was large. MS-DOS 3.3 was the normal "OS".
Richard
I did a university course around 1995 in which OO (using Eiffel) was one part of a "niche programming paradigms" one-term module, alongside functional programming and declarative programming. People only a couple of years younger than me used GOF Design Patterns as a textbook for their main programming module, so OO took off pretty sharply just after I graduated. It took me years to catch up.
slim
+2  A: 

When I started: OO programming. And shortly after that the Internet. (The internet might have existed earlier, but nobody cared where I lived).

Carsten
+17  A: 

Structured Programming.

Ferruccio
+11  A: 

OO programming and GUI programming with the rise of Windows 3.1

gyurisc
yup, I remember thinking windows would never catch on because it was too hard to write a windows app. There sure are a lot of lines in a windows api version of "hello world".
ScottS
There was also Turbo Vision with ASCII art windows in text mode :) But then Windows changed everything. I still have somewhere the book "Undocumented API functions in MS Win 3.1"...
m_pGladiator
I did use Turbo Vision and Pascal OO. It was fun :)
gyurisc
+38  A: 

.net

It was just starting to catch on... and the first few apps were starting to hit the market. Up 'till then, pretty much all commercial apps were C++ / Windows API.

George Edison
Yup, I remember when I thought .Net was just some fad that's going to die out in another year, and that it was a pain to have to install .net 1.1 to get some kinda plugin (for programming) I used to work.
Earlz
I thought .NET was some lofty idea with no real content when it was announced. But it's turned out fantastically and keeps getting better.
PeteT
I'm still not convinced that it isn't going to die out. Every time I get more deeply involved in it, I loathe it more. MS cut so many corners when they designed it.
rmeador
"Pretty much all commercial apps were C++/Windows API"?? I think there may have been a few (million) commercial Windows apps written in VB6...
MarkJ
-1 for making me feel old!
Joel
@rmeador: tried Qt?
darron
@Joel - Yeah, it makes me feel old... and I'm 25.
Adam Jaskiewicz
@MarkJ: But commercial apps? I don't know about that. Maybe there were a few - anyway, that's not the point.
George Edison
@rmeador: I hate .net. There, I said it. See my profile for more info. (and more ranting)
George Edison
What is your definition of *commercial*? At that time, I personally worked on a few VB6 apps that were sold (it wasn't inhouse development): also some small parts of Microsoft Office were written in VB6 - nothing more *commercial* than that.
MarkJ
@MarkJ: I guess I should clarify. I don't mean NO commercial apps were written in VB6, but that there were few. My bad.
George Edison
@rmeador: cut so many corners compared to what? Java? MFC? The Win32 API? Do you refer to the BCL, the languages, the tools, or something else? I can't imagine; after 12 years writing consumer/business software for DOS and then Win32 using C and C++, in 2002 I moved almost exclusively to C# -- it's a far more productive environment, and I'd exhort anyone not writing games or drivers to do the same.
Ben M
+84  A: 

The wheel. Before that, fire.

Juliet
Did you have to move bits uphill both ways, with only blinding snow displayed on your 3 nanometer CRT monitor?
dsimcha
@dsimcha: I was about to up-vote your comment, but I don't think we can even make 3 nanometer CRTs now. Smaller is actually more advanced at that scale.
Wallacoloo
How long ago was that? Never heard of wheel and fire programming.
fastcodejava
I am almost always programming when a wheel is invented. And again. And again. And again.
Dykam
Impossible ... you are not Jon Skeet.
Dean J
You had it easy - we had Vaxen.
Martin Beckett
@Martin: There are six of those where I work. I was very nearly assigned to be their next sysadmin.
Michael Myers
Oh, so you knew this guy: http://xkcd.com/505/
Mike DeSimone
@opc, meant the wheel is invented again and again.
Dykam
not really funny
Alex Baranosky
It wasn't wheel and fire, it was stone knives and bearskins!
Arthur Kalliokoski
+14  A: 

The shift from using desktop software to using web applications. Turned out to be the real deal.

I first noticed this when I ditched Eudora for Yahoo! Mail in the late 90s.

William Brendel
IMHO, this switch will not be complete until providers start offering fully encrypted remote data storage (i.e. even the provider can't get at the data without your key). I won't trust my important data to someone else if it can be accessed by anyone but me, and I know a lot of companies share my view.
rmeador
If someone would have asked me when that shift happened, I would have said Gmail. Thinking on it, +1, Hotmail and Yahoo! set the bar.
Dean J
HoTMaiL. HTML Mail. These guys were ahead of their time...
Spence
+16  A: 

AJAX and JavaScript Libraries were the next big thing for me when I started getting serious about web programming. I remember scratching my head a lot trying to get AJAX to work myself, getting really frustrated and almost giving up.

Then things like Prototype and JQuery came along and made everything awesome :)

Ganesh Shankar
This happened to me a month ago.
Tchalvak
+15  A: 

Event driven programming.

I definitely didn't understand it when I was starting out, but I occasionally fondly(?) reflect back on trying to implement the buzz-words before I really understood them.

Tanzelax
.. and just how do you "implement" a buzzword :) ?
bobobobo
That would've been the problem... implementing buzzwords instead of implementing the concepts. :p
Tanzelax
+32  A: 

The personal computer.

Ignacio Vazquez-Abrams
Yep, I still have my Interact and my Atari800.
phkahler
+4  A: 

.NET definitely, and I was defiant to the end. I finally installed it a couple months ago when I needed it to run an app that would jailbreak my iPhone (yeah yeah in before "hipster").

Also Napster.

some_schmoe
+16  A: 

I guess CD-ROMs - everybody was creating them thinking they would be millionaires.

Ravi Wallau
Is this thread just supposed to make us feel old? Because I remember writing HyperCard stacks to interface with a laserdisc player in the late 1980s ... before I ever got my hands on a CD-ROM. And then you had to figure out where someone left the !!@#%ing CD caddy so you could actually put the CD in the drive.
Joe
Now it's apps for the iPhone...
Philip Kelley
+7  A: 

When I started, NSAPI and VRML were the next big things. And also CORBA.

fastcodejava
I remember corba!
JMarsch
... and I remember VRML!
Andrei Rinea
+3  A: 

Agile development methodologies.

David Johnstone
+22  A: 

Pascal.

In college they got a Pascal distro and had to figure out how to build it. All you got was the Pascal source. So you had to use that source to write a Pascal compiler in some existing language (we had Algol-W and Sail), then compile Pascal using that bootstrap compiler.

Open-source circa 1975.

S.Lott
+1 for the "pre-gcc" way of building compilers
Earlz
It's truly, truly a shame that C++ kicked off as *the* language to know rather than Pascal...
BlueRaja - Danny Pflughoeft
+68  A: 

Geocities. Geocities was the thing. Everyone had to have their own homepage.

Edit:

The above answer, which is my original, really isn't a programming related answer. I will expand upon this by saying Geocities got almost all of my friends to start learning HTML. When I say almost all, I really do mean both programmer and non-programmer alike. For a few weeks I swear talking geek was in and then suddenly...

wheaties
Some people had like 3-4... then bragged about their animated-background-enterprise.
Zombies
+1 That's so true for me. My first website ever was on Geo :P
Kyle Rozendo
You're only truly old school if you registered your first GeoCities page when they were still taking the whole "neighborhood" metaphor a bit too literally and had index pages that showed rows of houses, some of which were vacant and available to be "moved into".
Tyler McHenry
@Tyler McHenry: I remember those pages, that was a horrible metaphor that confused me greatly about how things worked back then.
Tanzelax
@Tyler: I was there... I've seen it... and now it cannot be unseen.
Stefano Borini
Rhymes with "atrocities," amirite?
Derrick Turk
Wow, before this comment I didn't realize that that was what happened to me, but I did indeed get my start writing html because of a geocities page. I owe geocities a great debt.
Tchalvak
ditto @Tchalvak.
Kenneth J
Sheesh. Likewise. The shame.
Paul D. Waite
+2  A: 

When I first started, .NET (1.0 came out a few months after I began programming), but most of all CD-Rs

Suddenly everyone could burn their own CDs cheaply and every new computer had a CD-R burner. This led to a huge surge of (illegal) CDs copied, both music and software. And it seemed like Floppies were finally on their way out in the near future (with stuff like Roxio's DirectCD allowing seemingly random writes)... And then Flash drives came out and killed both of them.

Also, High Speed internet everywhere. Finally we got some (relatively slow compared to today) DSL where everything went zip-zap fast. (note, I lived in a rural area)

Earlz
I also remember when people said you had to hide CD-Rs because record companies were out looking to sue people who made copies, and that if a car repair shop saw them, they were supposed to report you to the police... back in like 2002 rofl
Earlz
+59  A: 

Bulletin Board Systems.

I feel old. :-)

Dean J
I remember the year I discovered BBSes ... and then, when I went to college, and there was the internet, with NNTP and MUDs, and you realized what a small little thing the BBSes were. (and I suddenly realized how there were so many people posting on FidoNet message boards).... and I've been told I'm not old for BBSing, as I never used an acoustic coupler -- my first modem was 2400 baud. As I hear there's 'apps' on Facebook and the like, maybe I'll join if someone will bring back some classic BBS Doors ... wait, no, I've lost too much of my life to TradeWars
Joe
@Dean J and Joe: first modem here was a 2400 baud too... Woaw, we're a bunch of old dudes.
Webinator
Don't feel bad, I was a cosysop in the early 90's and I'm still in my 20's.
tsilb
BBS also taught me what long distance phone calls were! Boy were my parents unhappy at the bill, I think it was around $2000.
kruczkowski
FIDO rocked in its own odd way. I remember when my 9600 baud modem board was a big deal.
S.Lott
Aw damn, the days when internet access cost hit the phone bill, good reminder of how good we have it now.
Tchalvak
300 baud, acoustic, tape hard drive, Pirates of Puget Sound
alchemical
@Joe - man, tradewars was great...
Erik Forbes
LORD (Legend Of the Red Dragon) was the only BBS game I really played (internet happened shortly after). Or was that one too recent for you folks?
MGOwen
@MGOwen: LORD was a lotta fun; any memory of Barren Realms Elite?
Dean J
I'm having flashbacks. I used to run a WildCat BBS, at the end, with TradeWars running multi-server over fidonet... ah the days of DESQview and memory optimizing...
qor72
FidoNet and BBSes in Portland, Oregon, and my very own usenet point hanging off of PSU. Those were the days. I was a lonely child. BBS's SMC, Realm of Infinity, and the TechBooks site. Sigh.
andersoj
+45  A: 

Stack Overflow... Not when I was just starting out, but when I was starting college.

Josh Curren
Stack Overflow is what I wished existed when I started programming
Earlz
Stack overflow is one of the few things that has managed to totally adjust my workflow and take it to a new level. Also to learn and correct my beginner mistakes, for me, it's one of the best things that has happened in my Programming/Engineering career.
Mr.Gando
From the viewpoint of a pre-Internet developer, search engines are the killer app that makes our lives remarkably easier everyday.
Dean J
Yea Stack Overflow would have made a huge difference when I was starting out. Just having blogs would have been awesome. Damn am I old?
chubbard
I have learned programming basically in a vacuum for 3 years, and then for the last 6 months I've drastically accelerated my learning with SO. There have been so many $head->table errors that someone else could have caught over my shoulder, or basic beginner's questions that take a knowledgeable person 2 minutes to explain, but to research on your own can be entirely fruitless that I've almost given up a number of times. Think about the value you're all adding to the community next time you say to the next cubicle "hey, what's the best way to do XXX?". **sniff tear**
Alex Mcp
yes...I distinctly remember carrying around a C reference book in college and having to lookup all the functions. Kids these days got it so easy with their googles, SO, and fancy editors. Crazy if you think about it, that was only 9 years ago.
dotjoe
+1  A: 

Well, I haven't been programming for too long. But flash drives were becoming more accessible when I started programming.

Wallacoloo
What year was this?
Earlz
Sometime around 5 years ago. I don't know the exact year that I got that flashdrive, though I still have it :D I just checked, and it's actually 256KB. I haven't used it in a while, I got a 1GB flashdrive 2 years ago to replace it.
Wallacoloo
Are you sure it was KB, not MB? I don't recall any flash drives that held less than 1/4 of a single 3.5" floppy disk.
ScottS
Gah! My bad, I was in a bit of a hurry. I checked and it is (not so surprisingly) in MB.
Wallacoloo
+13  A: 

Colour monitors

gnibbler
@gnibbler you mean color?
i am a girl
@jenny, no. why?
gnibbler
+4  A: 

Creating big, bold and colourful HTML pages. Looking back now, they actually look quite awful.

Creating dynamic web pages using CGI and Perl scripts. Some of the process involved splitting files and merging them back together with dynamic content as it was written, parsing and replacing text.

Fadrian Sudaman
+3  A: 

Punch cards. Then 5 1/4" floppies.

thekaido
Punch cards? Luxury! I still miss mercury delay lines.
PurplePilot
@thekaido: I wonder where you were? I associate punch cards with 8" floppies - but only on high-end machines.
John Saunders
+5  A: 

Laplink

Then

9600 baud modems

Then

Trumpet Winsock and this thing called the world wide web.

asp316
+1 for Trumpet Winsock!
Kyle Hodgson
+4  A: 

The abacus.

LeopardSkinPillBoxHat
About a year or two after I left they put a "very" accurate sun dial on the lawn of the Engineering Quad. Seriously.
When I was in school ('73 to '77) about half the engineers carried slide rules and half used calculators.
Wow, you must be really old :P
Ch00k
+1  A: 

BBSes, Personal Computers, memory expansion cards, multi-player turn-based games over modem or BBS.

Jacob

TheJacobTaylor
+2  A: 

PL/1, from IBM (circa 1970). It was Algol, FORTRAN, COBOL, and TheWaveOfTheFuture, all in one package. They threw in a lot of other junk as well. No company other than IBM could develop such a monster, and no one else used it. I think a lot of people were impressed, in the same way they had been impressed by the 1959 Cadillac Eldorado Convertible. The language did introduce some new concepts. The idea of stream I/O (as opposed to structured record I/O) is still with us. But the language itself had as many "features" as the C++ STL has objects, and it died under its own weight.

gary
I asked my Programming Teacher what language he liked the most and he answered "PL/1" because "It is math oriented and you can do anything in it."
Earlz
Ask him if he ever used the line-control trick, umm feature, to underline procedure names. A bit of work, but it looked cool. Remember, these were the days of punch cards, and upper-case-only language text. Did your instructor also like the "dream boat on wheels" Caddy? I guess that's not quite the same, because that wasn't a "do anything" vehicle. Like, you couldn't parallel park it.
gary
+1  A: 

As I am a .NET developer, when I started programming the important tech was:

  • XAML
  • WPF
  • RIA
  • Ajax
Nasser Hadjloo
+3  A: 

OO. It was going to change the world

jayray
and it pretty much did.
Zaki
A: 

Automation. There will be automated software design, development, testing and maintenance. I don't know who will do it, though.

kapil
+42  A: 

The Commodore 64

pdr
Back when we rocked with just 64K of memory.
JB King
When I was in my mid-teens, and owned an Amstrad CPC 464... The next big things were the Amstrad PC1640, the Amiga, Acorn Archimedes, before finally, the humble 286!
James Morris
Yes yes - when I was a kid I learned to program basic by copying code from compute! magazine. I miss those days, it's easy to say that basic with linenumbers, gosubs, peek/poke is complicated as compared to say coding in python or something - but a person could understand the entire architecture back then. Still have my c64.
jskaggz
And you dreamed about one day owning a 128, without really knowing why, because CP/M probably wouldn't make much sense to you.
jishi
@jishi Not really known why?? That wireframe game graphics of course! (i actually mean some combat flight sim which needed 128k but (according to what i read) was *awesome*) :))
mlvljr
ah, when syntax meant everything!! and there was no such thing as 'copy and paste' so you typed it like it was in the manual, and some of the characters in print were ambiguous, a big difference when you had 100's of lines. There was no debugging. There definitely wasn't any 'intellisense'. There was no internet to look up how to do something. Experimenting with Peek and Poke... Just the fact that the characters were smaller and you could read more text on screen than on your Vic 20 was a big plus for getting the 64 or 128 for that matter. Just as Archie Bunker used to say, "Those were the days."
Chris
@Chris, you've got me all teared-up here. Remember hours and hours of carefully copying out DATA arrays just to get a 16x16 sprite. Just. One. Tiny. Error....
pdr
L-shift O, which looked like a corner of a rectangle, to load a file. If you wanted to load from a REAL FLOPPY DISK: load "xxx",8,1. Those were the times.
Ritsaert Hornstra
C64 was my first computer. I remember when I got it for Christmas complete with the 5.25 floppy and I got a cassette drive from somewhere. I remember the "Press play on tape" prompt. I wish I still had it, if nothing else it would be fun to show the kids when they get a little older. (I think my 4 year old's Leapster handheld game has more power)
AdmSteck
+4  A: 

Microprocessors. I still remember buying my first "computer": a 6502 SDK with 4K of RAM!

Stephen C
+26  A: 

I started programming in FORTRAN during the fall of 1971. Back then, things like Unix and C hadn't been dreamed of, and the only OOP language was some weird Norwegian research thing called Simula.

Some of the "next big things" were:

  • 4-bit microprocessors.
  • IBM/370 mainframes. The notion of such raw computing power made us all giddy.
  • Terminals and time-sharing. It was almost unimaginable to think of typing in a program at a terminal and getting it compiled within minutes, as opposed to handing in a card deck and coming back a few hours later to get a thick printout to find out what typos you'd made.

Then when I was a senior in high school, in 1975, a guy I knew and his dad paid a ridiculous amount of money and got an Altair. That was a kit microcomputer with an Intel 8080 8-bit CPU. It didn't really do much, but we knew the world was never going to be the same.

Incidentally, my dad insisted I not study programming in college. He was certain computers were just a fad.

Bob Murphy
+1 for computers are just a fad..
Earlz
Tell us about the war, grandad! :)
U62
So one of the hot new things you were in on was the logical IF statement, the first FORTRAN conditional construct that wasn't some more or less weird form of GOTO.
David Thornley
@U62: Get offa my lawn or I'll whack you one with my C++ template for auto-generating OpenGL/ES shaders on embedded ARM! Damn kids these days, no respect for their elders. And they think txting is new. We used to txt back in the day by gnawing holes in a punch card and passing it around when the teacher wasn't looking.
Bob Murphy
@David Thornley: Actually, logical IF started with FORTRAN IV. We were using FORTRAN II, which only had the weird trinomial GOTO form of IF. It also had computed GOTO, which is to structured programming what a bottle of Stoli is to sobriety. You could do some twistedly incomprehensible things with computed GOTO.
Bob Murphy
IF( A .LT. B ) 1, 2, 4. Loved it.
S.Lott
@S.Lott: Actually, `IF (A - B) 1, 2, 4` (whitespace to taste, including spaces inside the `IF`). Ah, the arithmetic IF, computed GOTO, assigned GOTO (which I was always forbidden to use). Fun, fun, fun.
David Thornley
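
For readers who never met the arithmetic IF: IF (A - B) 1, 2, 4 branches to statement label 1, 2 or 4 depending on whether the expression is negative, zero or positive. A rough rendering of that control flow in C (a sketch only, with C labels standing in for FORTRAN statement labels) looks like this:

/* Sketch of the FORTRAN arithmetic IF "IF (A - B) 1, 2, 4" in C:
   a three-way branch on the sign of an expression. */
void arithmetic_if(double a, double b)
{
    if (a - b < 0.0)
        goto label1;            /* negative -> label 1 */
    else if (a - b == 0.0)
        goto label2;            /* zero     -> label 2 */
    else
        goto label4;            /* positive -> label 4 */

label1: /* ... */ return;
label2: /* ... */ return;
label4: /* ... */ return;
}
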
+7  A: 

Programming sites for mobile phones... they used to call it WAP (and WML - Wireless Markup Language)

We were all building sites for old phones for companies that vanished in the 2000 bubble

LiorH
Yes - I remember the fuss over WAP and WML - till anybody tried to do anything actually worthwhile or interesting with it!
Gordon Mackie JoanMiro
WAP was never going to work; people always wanted the real internet on their phone.
PeteT
WAP was trying to re-invent gopher. I offered to bring back our gopher server (which we had only off'd a year or two before) and create CGI frontend for it when my manager brought in consultants who were supposed to convert our whole website to WAP.
Joe
GPRS..... gasp!
Matt Joslin
+2  A: 
  • The Web (but not the Internet, that was already there). I remember using gopher.
  • C++ Templates (but implementing them efficiently took some time).
  • Java? No, the Java precursor ("Oak") hadn't hit the public awareness yet.
khedron
I remember gopher. And Archie and Veronica and Jughead. Back then, naming products was an art form.
Bryan Oakley
+46  A: 

Google Search Engine. Changed the whole game!

Usman Akram
@Usman - Yup, this was it for me. I spent so many hours just searching and reading (mostly how to install Slackware as it was the only thing I could download on my 28k connection in a sort-of reasonable time). Oh man, that was crazy.
JasCav
? wouldn't call it a game changer, many search engines existed already... going back to Veronica and Gopher.
alchemical
@alchemical: Yes but back in the day dogpile and infoseek just couldn't quite do it right
BlueRaja - Danny Pflughoeft
+1  A: 
  1. Thin client
  2. Service Oriented Architecture
  3. On-demand
Roland Bouman
+24  A: 

Ray Tracing.

As a graphics guy, Ray Tracing has always been the next big thing in graphics. I actually did some of this in BASIC on an Atari800 - yes it took hours. Several years back I was asked by someone over email when it would be practical for real-time use. I estimated 2012 which interestingly matched another person's prediction. I can almost stand by that now - it should be quite interactive on those Bulldozers AMD is going to have in 2011 (and presumably on whatever Intel is up to) and moving to "playable FPS" shortly thereafter. Once practical it will certainly be the "next big thing" in graphics for real, as it does everything with elegance and simplicity.

phkahler
I've heard of some demos that do real-time ray tracing already
rmeador
+1 for having an accurate prediction
RCIX
@rmeador - yes, I've done some real-time ray tracing myself. You'll see it full screen with general purpose libraries in a couple years assuming Moore's law continues a little longer.
phkahler
So it will become "practical"; but will it eventually *look better* than the current way of doing things will on the same hardware? (not talking about accurate reflections on screwheads, noticeable stuff)
MGOwen
@MGOwen Raytracing has the potential to look far better than anything we've got. You're probably referring to the old SIGGRAPH raytraces of spheres in a land of cubes. Google "monte carlo raytracing" to see just how good this stuff can look. Only problem? Monte carlo raytracing is 4x more expensive than 1980s raytracing.
Frank Krueger
@Frank Krueger Monte Carlo does look cool, but my point is that ray-tracing won't be widely used for games until it creates a *better looking* result with the *same hardware* (or same cost of hardware). That was my question - will it? I think I heard somewhere that raytracing scales nicely with multiple CPUs, making it possible that at 8 or 16 cores RT will finally produce a better result than traditional, this answer is currently missing any actual info/reason such as that for the claim.
MGOwen
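
The "elegance and simplicity" claim is easy to illustrate: the heart of a ray tracer is just "for every pixel, cast a ray and find the nearest hit". A deliberately bare-bones sketch in C (one hard-coded sphere, ASCII output, no shading or recursion, helper names made up for the example) shows the whole idea:

/* Minimal ray tracing sketch: one sphere, perspective rays, ASCII render.
   This is an illustration of the core idea, not a real renderer. */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } vec;

static double dot(vec a, vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

/* Distance along the ray to the sphere, or -1.0 on a miss. */
static double hit_sphere(vec origin, vec dir, vec centre, double radius)
{
    vec oc = { origin.x - centre.x, origin.y - centre.y, origin.z - centre.z };
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * a * c;
    return (disc < 0.0) ? -1.0 : (-b - sqrt(disc)) / (2.0 * a);
}

int main(void)
{
    vec eye = { 0.0, 0.0, 0.0 };
    vec centre = { 0.0, 0.0, -3.0 };
    int px, py;

    for (py = 0; py < 24; ++py) {            /* for every "pixel"...      */
        for (px = 0; px < 48; ++px) {
            vec dir = { (px - 24) / 24.0,    /* ...build a ray through it */
                        (12 - py) / 12.0,
                        -1.0 };
            putchar(hit_sphere(eye, dir, centre, 1.0) > 0.0 ? '#' : '.');
        }
        putchar('\n');
    }
    return 0;
}
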
+1  A: 

Client Server ring a bell?

Zombies
+4  A: 

When I started programming? Structured programming.

When I got a job programming? Object reuse, STL for C++

HerbN
+19  A: 

The Apple ][e. High res graphics! Built-in BASIC! That thing ran at 1MHz! Boo-yah!

Beska
+1, only for me, it was the Apple 2+ with 48k!
Matthew Flynn
I wish I hadn't gotten rid of mine.
David
Fond memories...
Steve
Except then no one said "Boo-yah". Maybe it was "Huzzah" then?
aaaa bbbb
Apple ][+ with 64K (the 16K add-on board) so I could run UCSD P-System.
S.Lott
back then it was Duo-Drives on my ][e, now it's a Drobo on my custom, hand-built, butt-stomping rig
Hardryv
A: 

Ruby. I'm young, OK? I kept seeing it on DZone and then met one of RubyConf's organizers. All the hype just pushed me to other new languages such as Clojure.

justkt
+1  A: 
  • UML
  • RUP

ugh...

nos
But hey! You could "generate" code from design... Right?
Chetan Sastry
A: 

When I was in College, Haskell was brand new (Not widely distributed) and Java was JUST over the horizon, and people were waiting on them excitedly...

HTML had recently become bigger, Mosaic was just out, and sites like "The Big Red Button that does Nothing" and the genetic art project (anyone remember those?) were big.

Personally, I was just excited to be able to use usenet, but it wasn't new at the time, just new to me...

Brian Postow
A: 

When I started programming, the next big thing was 3-tier development.

It represented a departure from the client-server model, particularly focused on removing the logic from the DBMS in order to increase scalability and portability.

Of course, the tools at the time (in the Microsoft world, at least) were less than optimal for this (people that had to deal with COM versioning will confirm what I'm saying).

Diego Mijelshon
+3  A: 

let me think... 16-bit microcomputers, I think. Compiled languages for home microcomputers came in at about the same time. Oh, the excitement when Sierpinski's Triangle would take 10 minutes to draw instead of 30, when rewritten in Turbo Basic!

SF.
Compiled languages for home microcomputers were somewhat earlier, but they were really awkward to work with until Borland came along.
David Thornley
+13  A: 

64k of memory on an affordable home computer. What will we do with all of that?

Joe
Write BASIC programs that fit in `38911 BYTES FREE`!
Mike DeSimone
+1  A: 
  • Ruby & Python
rlb.usa
+4  A: 

HDDs for IBM PCs - I remember getting my first 10Mb (yes 10 whole megabytes!) on a card that slotted into one of the expansion slots and fitting it myself. Long time ago now - OMG! - must be quarter of a century.

Gordon Mackie JoanMiro
+2  A: 
The only thing I need to add to that is Parallel Processing. Up in the penthouse of the CS building they had a rack full of those newfangled microprocessors. IIRC, they were mainly interested in how to manage the work for such a beast rather than practical applications, although they did have it generating Bach-like music on a speaker.
Hugh Brackett
+3  A: 

CASE (Computer Aided Software Engineering) was the first next big thing for me.

Bryan Oakley
I remember this one too. What a floperoo! Very few of these tools made it to market and the ones that did churned out awful, sloppy, buggy code. FAIL!
Bob Mc
+11  A: 

Rapid Application Development (RAD) which if I remember correctly was the concept of dragging and dropping components onto a form.

I started with Delphi 1.0 and the latest version I used was 2005. Much easier to write native Windows apps than C or C++.

Dennis Palmer
I still have my copy of Delphi 7... :) It's a shame they no longer offer the Personal editions.
Lucas Jones
I loved Delphi! Used it from the initial beta on the Borland Partners program through version 4 (when I couldn't find jobs for it). I even had my Delphi Client/Server Certification. Great tool, and I didn't stop missing it until c# came around.
JMarsch
+1  A: 

Maven - definitely

c0mrade
+1  A: 

WAP / WML

Fortunately I didn't waste much time with that.

Bruno Rothgiesser
+5  A: 

For me? Design Patterns. It's still a work in progress, but they were the next big thing that allowed me to take my design skills to a new level (which I'm still refining, of course).

It's that and Peer to Peer distributed Networking. I forgot to add this when I first replied, but for me right now P2P Networking is one of the "Next Big Things", and it has really made some huge changes in the scope of File Sharing (and the world's view of it). But believe me, we will see many more applications of P2P than simply File Sharing.

Mr.Gando
Wasn't the early internet more P2Pish than Client-Server?
Earlz
A: 

CASE tools I think were big hype at the time.

Tim
I remember those !
bigtang
+1  A: 

16-bit programming in Visual Basic 5! I remember freaking out at how cool intellisense was.

Terry Donaghe
VB5 was 32-bit only. VB4 came in both 32 and 16-bit.
Chetan Sastry
D'oh! Read what I meant, not what I wrote! lol :D Good catch. I knew it felt wrong when wrote it.
Terry Donaghe
+4  A: 

DOS, seriously.

I started programming on a learning machine. The machine was missing many software and hardware components compared with a modern computer. Every time it's turned off, the source code is gone. Every time it powers on, the interactive BASIC environment is set up waiting for your input, like:

10 A=2:B=4
30 PRINT A
RUN

I had a lot of fun and actually learned a lot with it. Later, when someone introduced me to the real stuff with an 80286 inside, I was so confused about why I needed to learn DOS, because it appeared to have nothing to do with programming ;-)

Codism
+1. I first learnt programming the same way. Must find my old code scrapbook where I would transcribe all my code - effectively my hard disk :P.
MAK
+1  A: 

MFC (Microsoft Foundation Classes) and Windows 95

chburd
For me, Win95 was all about those bits from WinG that wouldn't work in Win32s. Work was Ada. Windows and C was fun. Yes, seriously.
Steve314
+1  A: 

Java. Though this question is sort of hard to answer because often when you're just getting started, you don't know what the next big thing is, because you're too busy catching up on the last few big things that actually made it.

jsn
A: 

I remember taking my mock Computer Science A-Level and having to discuss whether Unix was going to break through into the mainstream and my teacher getting very excited about the potential in Lotus 1-2-3. I guess they both went the same way.

amelvin
Lotus did break through into mainstream for a while... then I believe it continued on in IBM - Lotus/Domino is still used in financial nooks and crannies...
Taryn East
And I'm pretty sure I've seen that "Unix" thing you mentioned around the internet here and there.
P Daddy
Unix isn't quite mainstream in the desktop world (but then again, you have linux and OS X is BSD under the hood), but it is mainstream in servers, and probably already was at the time you speak of.
MAK
My A-Levels were a while ago, I remember singing 'That Joke Isn't Funny Anymore' with a couple of mates as we went into our Pure Maths A Level exam - as The Smiths were starting to break into the top 20 at the time.
amelvin
A: 

.NET 2.0 assembly support in an application that I was writing scripts for.

Learned .NET to take advantage of the power unavailable in the scripting language.

Aequitarum Custos
A: 

UML and flowcharting. Soon you would not write any code; you would just diagram it out. UML is good for some things, but not that good.

rerun
+2  A: 
  1. Megabytes: The first machine on which I did anything that could be called programming had 512 KB (with a K) of RAM. My web browser (just one program) is using over 300 times that amount of memory just to let me type into this little box.

  2. 9600 baud: My first modem was a 1200 or a 2400, can't remember.

  3. America OnLine: It was big stuff back then.

Seth
I remember working on a machine with no hard drive at all.
HLGEM
@HLGEM - the machine in question was a Mac 512KE with an internal 800K floppy drive. The first (20MB) hard disk came much later with our Mac Plus (I still have it :P).
Seth
The first computer I owned had 16K of memory. I don't know about the computer I first programmed on, since all I saw was the teletype.
David Thornley
+1  A: 

I believe that the next big thing was going to be the language to replace Fortran for number-crunching. That language remains the next big thing. And that language is still called Fortran.

High Performance Mark
+2  A: 

1200-baud modems. Those things were FAST!

Kristopher Johnson
+3  A: 

Windows as a serious gaming platform (thanks to DirectX). Learned all the mode 13h tricks, even got SVGA figured out, wrote some little DOS games, all the while the rest of the world slowly switched to Windows. But it was a lot of fun, a time when every game feature was a clever hack to make it run fast enough and my computer was a blistering 486 at 66MHz. Haven't done any form of game programming in ages but DirectX, although an essential and great introduction, was never quite as fun as mode 13h!

Matt Inglot
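
For those who missed the mode 13h era: it was the 320x200, 256-colour VGA mode with a linear framebuffer at segment 0xA000, so "putting a pixel" was a single byte write. A minimal sketch, assuming a Borland/Turbo C style 16-bit DOS compiler (non-standard far pointers, int86() and MK_FP from <dos.h>, getch() from <conio.h>):

/* Sketch only: enter VGA mode 13h and draw a colour gradient line.
   Assumes a 16-bit real-mode DOS compiler such as Borland/Turbo C. */
#include <dos.h>
#include <conio.h>

static unsigned char far *vga;   /* framebuffer at A000:0000 */

static void set_mode(unsigned char mode)
{
    union REGS r;
    r.h.ah = 0x00;               /* BIOS int 10h: set video mode */
    r.h.al = mode;
    int86(0x10, &r, &r);
}

static void put_pixel(int x, int y, unsigned char colour)
{
    vga[y * 320 + x] = colour;   /* one byte per pixel, row-major */
}

int main(void)
{
    int x;
    vga = (unsigned char far *)MK_FP(0xA000, 0x0000);
    set_mode(0x13);                                /* 320x200, 256 colours */
    for (x = 0; x < 320; ++x)
        put_pixel(x, 100, (unsigned char)x);       /* gradient across row 100 */
    getch();                                       /* wait for a keypress */
    set_mode(0x03);                                /* back to text mode */
    return 0;
}
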
A: 

Using a CMS like WordPress.

radi
+2  A: 

COM objects and .NET Framework. Good times ;)

a_person
+2  A: 

VB6 Webclasses were just announced and the web would never be the same ;)

(to be fair I had been programming for about 5 years at the time)

keithwarren7
+2  A: 

Write Once, Run Everywhere

kanchirk
A: 

32-bit processors. And then with built-in math coprocessors!

Chetan Sastry
I think you have that backwards; weren't co-processors on 16-bit CPUs?
Neil N
The 386 didn't have a built-in floating point processor. The first CPU with a built-in coprocessor was the 80486DX.
Chetan Sastry
+13  A: 
  1. 8" double-sided, double-density SOFT sectored (!) diskettes ...
  2. MP/M, the multi-user CP/M ...
  3. 16-bit compilers !
  4. WordStar 3.31
  5. Dual disk drives, which meant your program(s) could live on one drive while your data existed on the other ! Expensive, though ...
  6. 64K of RAM, with 256K on the event horizon !
  7. Microsoft's Fortran-80 for CP/M !
  8. SuperCalc ! Still the best spreadsheet ever ...
  9. GUIs ? Strictly for wussies ...
  10. Real programmers 'listened' to the disk drives to diagnose poor programming practices ...
  11. A cookin' computer cost $4,300 with two drives, a Z80 CPU, and a massive 64K of RAM. Plus keyboard and monitor. The Zenith Z-19 was nice ...
HaroldTheBear
+1! Regarding #10- I still remember the distinct * Grind Grind Grind * sound before the "`Data Error Reading Drive A. Abort? Retry? Ignore?`"
Chetan Sastry
11. Heck, I wound up paying less than $3000 for mine, although it had only 48K. That was the most I ever spent on a computer, even not adjusting for inflation.
David Thornley
+1! Programmers listening to hard drives to diagnose poor programming practices :D
Buzzy
There was a demo back then that imitated the sound of a washing machine by starting and stopping the floppy drive motor.
ninjalj
@Buzzy - not hard drives! 5 1/4" floppy disk drives... no hard drive at this time. The first one I saw was a box 40x15x20 cm with 5MB on it - that was huge lol. More expensive than the computer (Apple ][) driving it.
laurent-rpnet
+1  A: 

Multicore processors...

Helper Method
+2  A: 

The big thing at the time was how to use the CPU inside the Commodore 64 1541 Floppy drive for additional computing power (it contained the same CPU as the C64, so it sped up Mandelbrot calculations a lot).

The next big thing was the Amiga and Atari ST...

Michael Stum
I had a Commodore +4 with a uselessly unaligned 1541, never dreamed I could parallel process (didn't even know what it was)
Arthur Kalliokoski
+1  A: 

Java was real hot in the late nineties. Then in the beginning of 2000 it was all about XML.

marko
+2  A: 

BASIC and time-sharing (using one computer to serve dozens of teletypes scattered around).

David Thornley
+3  A: 

Yahoo! Google still didn't exist and Yahoo! was THE portal on the internet.

I remember thinking how cool it would be to work at Yahoo! after finishing school.

OscarRyz
+2  A: 

Copy-paste philosophy.

Harun ESUR
Do you remember "Paste special"? (Excel) ...
Andrei Rinea
A: 

Object oriented programming

CASE tools

A bit later: Apple's Taligent

+1  A: 

Windows 95 was about a year away at the time.

Of course I started on a TRS-80 so I didn't know at the time.

Joshua
You were using a TRS-80 (released in the 70s) in the mid 90s???
Steve314
Yes, already antique (free). The manual that came with it had the best introduction to programming I have ever seen. FYI, my model was built in 1980 and the software supported years 1980-83.
Joshua
+1  A: 

Some of the new big things that I crossed:

  • Internet in most schools/homes
  • Free Internet with NetZero
  • Altavista was THE search engine
  • Windows 95
  • Windows XP (at last a good/easy OS by Microsoft)
  • Java
  • .NET
  • All the Web 2.0 stuff (including AJAX and friends)

Edit: Added some below:

AlexV
+6  A: 

WAP on Nokia 8110 'matrix' phones and XML. I hate both still to this day :D

Richard Reddy
+4  A: 

I started programming in 1983 on a VIC-20. The C64 had just come out, but unbeknownst to most, the Mac and the Amiga were both right around the corner. The Next Big Thing had to be the GUI.

Payton Byrd
+4  A: 

Wikipedia. Maybe not the most programming oriented item but it certainly has helped!

The Jug
+3  A: 

Java applets. I had people begging me to put silly applets on their webpages pretty much as soon as they found out I could program Java.

Casey
The same is true of Flash today
finnw
+2  A: 
  • Compuserve
  • Structured programming
  • Modula-2
  • 16-bit processors (8086 vs. 68000 vs. Z8000)
  • Hard drives on microcomputers
  • CP/M 86
  • 5.25" floppies
Jay
Did you use JPI Modula 2? I think it was bought out from Borland, and had an IDE similar to Turbo Pascal 4.
Steve314
oooo, hard to remember...JPI sounds familiar, not sure if that was the product I used or something else. Mainly just played with it, I was a Turbo Pascal devotee.
Jay
+2  A: 

Try learning how SaaS (Software as a Service) works. More and more companies are moving to this approach.

Jojo Sardez
+1  A: 

The concept of "Write once, run anywhere."

The idea was that you'd write your code once and it'd run on any machine anywhere thanks to a virtual machine. Back then it was the early Java VMs. These days it's Javascript in a browser.

Olly Hodgson
+1  A: 

The next big thing isn't going to be anything in the computer at all. It's gonna be apps on portable systems like iPhones, palmtops and mobile phones. With the onset of 3G, bandwidth is of little concern, which was an issue previously.

Why is this going to be a big thing ?

Because most corporate managers are mostly on the move and would prefer any new application to be accessible from their mobiles rather than their cumbersome laptops. This is coming from the fact that quite a lot of companies have iPhone business apps lined up to be released.

Mulki
+1  A: 

XML

In my first job almost everything I suggested was met with the response, "can you use XML for that?". Since then, I have used a lot of XML which I guess counts as a successful next big thing.

flamingpenguin
+3  A: 

When I started the next big thing was Aspect Oriented Programming.

egarcia
I looked at Aspect C++ once. The first few tutorials seemed quite good. Then I saw some "real" aspect-oriented code, and I was reminded that "when all you have is a hammer, everything looks like a nail". The aspect-oriented crowd most definitely had other tools at their disposal - but they seemed determined to use that one particular tool for *everything*.
Steve314
+3  A: 

year 2k: XML, everything could be solved with XML...

Ingo
+1  A: 

I'm quite new to the world of software and development, but I think the latest craze is content management systems, with the top being Drupal and Joomla, maybe WordPress.

I'd like to see more JSP (Java) and .NET CMSes that can do what Drupal does.

Maybe I'll re-write Drupal in JSP!

Andrew
Why would you rewrite Drupal in jsp?
Toby Allen
+3  A: 

The 1GHz processor - I was so pleased AMD got it out before Intel :-)

Andy Shellam
AFAIR it was IBM who made the first prototype 1GHz chip, using PowerPC architecture.
Tadeusz A. Kadłubowski
+3  A: 

Software-wise, Turbo Pascal 5.5. It was a breeze.

Hardware-wise, computer mice and color monitors. Good times.

RegDwight
+4  A: 

I remember the XML hype getting a boost by the introduction of XSLT. I seem to remember that there was some vision about having a unified format in XML that you then translated to different formats for different browsers (such as early mobile phones). Unfortunately, tool support was very lacking.

waxwing
+3  A: 

Semantic web (technologies)

Gnark
+1  A: 

All the Web 2.0 stuff and AJAX.

Ryan Liang
+18  A: 

Loading programs from audio tapes (ZX Spectrum).

Once I even attached a microphone instead of a tape player and tried to imitate the sounds to see if something would load into the computer. Sadly nothing did. But I got those carrier sync lines running on the screen though!

Developer Art
I love it. As a young lad, I tried connecting a tape recorder to a TV antenna cable to see if I could record a TV show, or actually: all of them, at once. Little did I know about (frequency) bandwidth...
bart
+2  A: 
  • Capability Maturity Model (CMM)
  • Team Software Process (TSP) & Personal Software Process (PSP)
  • UML
  • Case Tools
  • And of-course Write Once, Run Anywhere
Kashif Awan
+4  A: 

Jon Skeet was born.

Milan Ramaiya
+2  A: 

EJB Entity Beans with Container Managed Persistence (from EJB 1.1)

rlovtang
How I still hate EJB to this day!
The Elite Gentleman
+9  A: 

Visual Basic for Windows, now everyone can program in a natural language with a graphical interface.

Beaner
-1 for the 'now everyone can program'
Charlie Somerville
@Charlie - you don't remember how it was advertised, or were you just not there?
Beaner
@Charlie - that was really the ad tag line. A huge improvement over needing a full IT department with PhDs.
DaveE
@Beaner Oh, in that case, I'll remove the downvote if you make a small edit. I can't undownvote without it being edited
Charlie Somerville
A: 

JBoss Seam framework and all its sweet magic with conversations and web apps

BeanShell, Oh! you type and it executes? ... ruby flavor ñum ñum : D

LazyLoading As sweet as an ex. You could love her, but you still know what happened last time.

Agile... Geez.... Agile!! YEAAAAHH!!! this is the PM style I was looking for sniff... snif... :_ )

Lambda in C# *crosses fingers* I have good news... : D

StackOverflow Where I discover how far you could go : P

SDReyes
+4  A: 

Mr. Bojangles, animated. TI-99/4A BASIC. But I was only 7 or 8 years old at the time. My mom might have been skeptical, but I believed. I can't say working on Mr. Bojangles changed the industry, but it did change me.

JasonTrue
+6  A: 

I remember everybody talking about this new "Java" thing.

Tom
And I remember everyone saying what a bad idea they thought it was (I did not agree though.)
finnw
+3  A: 

It was about three or four years after I started, but the first thing that I considered my "next big thing" was the dBase Professional Compiler from Ashton-Tate. That software was going to turn my career on its head...if it ever materialized...which it didn't. So, I responded to a mailing from Philippe Kahn, then-CEO of Borland, offering Paradox for DOS (retailing at $795) for only $99. I bought it, got good with it...it transitioned me into Delphi...which transitioned me into .NET...and here we are.

Neil T.
A: 

When I took my first steps with computers and programming... Multimedia, Amiga, AMOS and Blitz Basic :) My first computer was an Amstrad 1512 in 1987 or 1988 (not sure); I was 13 and wrote my first BASIC programs on it. An Amiga 2000 with 2MB RAM + 100MB HD came a couple of years later - boy, what an upgrade!

When I first got my real job, the next big things out there were Google, followed by MySQL 4.1, PHP 5 with its way better OOP support, and .NET (which I never embraced).

Spiros
+2  A: 
  • Turbo Pascal for DOS.
  • QuickBasic (I remember playing the infamous gorilla.bas and it came with sound).
The Elite Gentleman
Whoa.. that gorilla.bas was awesome.. the base for many of my first small games I coded in basic.. Those were the days..
Arcturus
+7  A: 

For me, the consumer Internet. Amazon, eBay... basically the internet as a utility.

The Internet existed, sure, but I first really got into computers in the mid-90s just before it finally took off with regular people. When I was in elementary school, only the rich kids, and the kids whose parents worked with computers, had computers at home. My best friend's dad was a sysadmin at the local university, and introduced me to the internet. Before then, a computer had a green screen and was only useful for playing Oregon Trail, as far as I was concerned. By the time I was in high school, EVERYONE had a computer at home, most had cable modems, and AD&D had been replaced with LAN parties[1] as the nerd social activity of choice.

These days, the Internet is taken completely for granted, like electricity or water. When I was in elementary school, it was a toy for rich kids and nerds. We didn't look things up on Wikipedia, we popped MS Encarta into the CD caddy. The internet tied up your phone line, and you got charged by the hour, so if you used it you better have a good reason, and be done quick. If you were going to be reading a long article, you would log on, download it, and then disconnect and read it offline. To do otherwise would be like leaving the water running while you're brushing your teeth.

[1] When we wanted to play a multi-player game, we couldn't plop down in front of the TV and get on XBox Live with someone in another time zone. Even if we could, we would have to resort to typing insults at each other---our computers, connections, and games couldn't handle headsets. We had to unplug the computer and schlep it over to Timmy's house, where we would run extension cords all over the house to avoid blowing circuits.

Adam Jaskiewicz
I remember a gathering of 10 people, all with their own PC's and 15" CRT monitors crammed into a single bedroom flat. All windows were opened and it was still like a sauna in there.
graham.reeds
+5  A: 

When I started, the next big thing was Windows. I still remember my first boss telling me it would be a "fad". :)

Hank
+3  A: 

Way to make me feel old, guys (er, and possibly gals too - be interesting to know the ratio here!)

But what I really longed for when I first got started was non-linear storage. Yes, in my first year of programming, everything was stored on punched paper tape. Let me tell you, when I got to use a card punch machine a couple of years after that, it was heaven!

chris
+4  A: 

COM, and yeah.. the Internet. Then XML. It was a dark time, other than the internet.

royal
"How can we use this 'XML' thing in our software? Sounds really neat, I read an article in <some random management magazine> about it." I remember that meeting well.
nathan
+3  A: 

I started out in IBM mainframe land, and IBM thought the Next Big Thing was clearly 370-XA (Extended Architecture), which boosted the address space from 24 to 31 bits. Because naughty programmers had found all sorts of uses for those "free" 8 bits, converting the operating system took forever.

Of course, this was like asking what the Next Big Thing in dinosaurs was. At the time I was starting out, the real Next Big Thing was C. I didn't learn about C until years later, but I think we know how the story turns out...

Norman Ramsey
A: 

16-bit stereo sound cards, via Soundblaster 16. The next big thing was cheap multichannel wavetable cards, like the Gravis Ultrasound.

spoulson
+7  A: 

In the world? Here I hold Computer Design magazine from August 1983 (3 years before I started). It says:

"ARPANET has good potential"

"C language: key to portability (comparison with Pascal source code to show how concise it is)"

"Apple's new Lisa - how do you do stuff with a mouse and a GUI"

"Microsoft's MS-DOS 2.0 contains many enhancements, including directory support"

Not very different than today, eh? :) Internet. C. Microsoft is seasoning an existing product as the next big thing while Apple is creating something that is so revolutionary that people will have to wait for all their friends to buy it before they get convinced.

However, locally in Eskisehir, Turkey, where I grew up and started programming, programming itself was the next big thing. I think that's one of the things that got me into it: amazed people asking "how did you do this?"

ssg
I remember MS DOS 2. And I do mean "MS" (not "PC", which made a difference back then). I remember this crappy PC which held several years' worth of never-backed-up accounts, and me being given the task of transferring them (via floppy disk) to another PC. I remember that the hard disk was drive A. A: was always the floppy drive, except for that one crappy PC. And I remember inserting that floppy and typing the habitual command to delete all files from a floppy, ready for the next load of files. **THEN** I remember how *none* of the undelete utilities would run on that crappy PC. An Apricot, I think.
Steve314
Sad memories :) The oldest DOS I actually saw with my own eyes was 3.00, running on an IBM PC/AT with no graphics (MDA). If you lost the boot floppy you'd drop into ROM BASIC, from which you couldn't save your code to floppy, only to a tape drive. That was a bad lesson for me as well.
ssg
+3  A: 

Wow... here's a good question finally ;-) What was the next big thing when I started in 1983?

  • Commodore 64, Apple Lisa and personal computing as a whole
  • The Pascal language: later came the amazing Turbo Pascal 3.01 (39KB editor + compiler - beat that!)
  • dBase III (Ashton-Tate)
  • Framework I (Ashton-Tate), the first integrated suite for PCs

Might remember a few more later 8-)

Etamar L.
+2  A: 

When I started learning programming, in my head, the next best thing was ME.

I changed my mind in a second though... :P

Honestly, I'm fresh out of college. I believe the power is in mobile devices. Still not happy though.

Still waiting for a chip in my head which will do anything a laptop does...

camilo
+1  A: 

it was those perky subroutines

Anurag
+3  A: 

When I started out, the next big thing was Java. All my classes were in C++, but the very next semester after I left school, they started teaching many of the beginning classes in Java instead.

adimauro
+2  A: 

SGML. This was in the early days of HTML, and XML wasn't even on the horizon.

Well, in a way SGML did make it big, except not in its original, very generic (like XML) and very loose (unlike XML) form.

bart
+1  A: 

It's the network, stupid. And just before that, it was the workstation.

Henry
and after that it was the network and after that it was the workstation and after that...
JohnFx
+3  A: 

Object Oriented Programming

cherouvim
+3  A: 

1994

  • RAD (Rapid Application Development - visual IDEs) programming - VB and Delphi (boy, did that catch on).
  • OS/2 - don't think that went anywhere.
  • Linux
  • Netscape
  • Object databases (never liked that idea, still don't - Documentum sends shivers down my spine)
Toby Allen
Yeah! Delphi was way cooler than VB although, AFAIK, VB was first on the market..
Andrei Rinea
+10  A: 

I started coding professionally around the turn of the century and the next big thing, at least in management circles, was...

drumroll...

XML

I still recall receiving an email from my manager at that time simply stating:

"We need to get into XML ASAP - please put in place a process to transfer our
 data using it"

I'm ashamed to say I responded by doing something very similar to this.
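
The link is long gone, but judging from the comment below it was the classic "just wrap the existing CSV in a single XML element and call it XML" trick. A purely hypothetical sketch of that anti-pattern (my own illustration in Python, not the original):

    # Hypothetical sketch of the "we're XML now" anti-pattern hinted at above:
    # take the existing CSV export and wrap it, verbatim, in one element.
    import csv, io
    from xml.sax.saxutils import escape

    rows = [["id", "name"], ["1", "Ada"], ["2", "Grace"]]  # made-up sample data

    buf = io.StringIO()
    csv.writer(buf).writerows(rows)

    xml = "<data>\n" + escape(buf.getvalue()) + "</data>\n"
    print(xml)  # technically well-formed XML, still just a CSV blob inside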

richeym
@richeym -- the amazing thing is that if you replace the CSV in the middle with an HTML table, it looks similar to VOTable. Swap out the commas for fixed-width fields, and it could be VOTable with binary serialization. (and um ... VOTable is a recent format by the astronomy community for sharing tabular data)
Joe
+2  A: 

Let's see: Slashdot (my /. user id is 13xx), msql, Perl 5 had just come out. Windows 95 was new.

Actually, I recall having a conversation with a flatmate: we had figured out how to make a form in HTML but not what to do with it. That must have been around '92 or '93.

Zachary K
+4  A: 

I started in an after-school study on a black and white (or, well, black and green) TRS-80, in about 1982. The personal "microcomputer" was the next big thing at the time, I guess. People were saying that variations of them would be in practically every home within 10 or 20 years. I mostly agreed, but I wasn't sure, because.. they were so expensive and clumsy at the time, and I wasn't sure what most people would use them for. $1,000 was a lot of money back then, and.. computer games and word processing weren't really quite enough to justify the cost, for most people. (People saw them as being fancy electric typewriters for the rich and/or eclectic.) The Internet really has made a huge difference in the market penetration of the personal computer itself. That, and the reduction of its price (relative to inflation).

I saved my stuff on a cassette tape, which took forever and was always failing. I lost all my work many times. I got my own TRS-80 Color Computer, hooked it up to our family's TV (which rarely got watched back then), and was super jazzed about having color. I bought books and learned assembler language and worked on game graphics as a hobby.

I added floppy disk drives, and was over the moon about them at first, but they were always getting out of alignment and I was still frequently losing all my work.

But anyway, the "next big thing" I wanted to talk about was the IBM PC Clone. I got my first one in 1986, I think. I remember being soooo happy about finally having a hard disk! Man, what an awesome thing! And it actually worked most of the time! Incredible.

The offspring of IBM PC Clones ("Wintel" boxes) have now practically taken over the world. Tandy/RadioShack caved in decades ago; then, recently, even Apple finally gave up the fight and started using them. PC Clones succeeded because somehow or another, Intel/Microsoft managed to make it legal for anyone to build and sell an "IBM-compatible", and somebody somewhere somehow managed to even make it legal to compete with Intel and make Intel-compatible CPUs! Maybe someone else knows more about exactly how these things happened?

The same thing that's made them successful, however, has also been their plague - ubiquitous hardware manufacturers of a huge variety of hardware, and stuff getting to market long before there was any well established standard for how the various pieces ought to operate and interact. The way the various sub-industries have come together and cooperated and standardized as well as they have is nothing short of miraculous.

Software development as an industry is a relatively recent phenomenon, let alone the whole spastic group worship of things like ORMs and Agile. That managers expect me to learn and adhere to some silly ephemeral kind of "standard" or "best" practice or another (it's different at every place, and never really any better at any place) is bad enough, but the thing that really galls me is this whole notion of "salaried exempt" - somehow, we've become unworthy of getting paid time and a half for overtime! Whoever gave that one up is in serious need of having the @#$ kicked out of them!

Shavais
A: 

The first version of Flash from Macromedia

lamas
+1  A: 

I started around 1977 on a TRS-80; I recall macros in the assembler being a wonderful thing.

tgunr
A: 

Definitely multi-tasking and soon windowed GUIs.

At the time, the only game in town for running two programs at once on a PC was TSR (terminate-and-stay-resident) programs, which were mostly used for system-level stuff like memory management (for example, QEMM).

I remember being so psyched about a Turbo Pascal toolkit that would manage a basic windowed interface and have pull down menus in my app (think Word for DOS windows, not Windows 95).

This was slightly before Windows 3.0 came out.

JohnFx
+4  A: 

If you are referring to the "next big thing" according to the media, and not the programming community, you have to include Artificial Intelligence.

AI has been the next big thing for as long as I have been working on computers. Yet, the feature set for Skynet is still hung up in marketing meetings.

JohnFx
It's the "don't send cyborgs to the past to kill humans" requirement. It's dull. The project needs a more dynamic, exciting image.
Steve314
For some reason I suspect that Steve Jobs was responsible for the T-1000 trying to one-up BillG on the T-800 design.
JohnFx
+1  A: 

When I was first starting out (circa 1387 AD) everyone was buzzing about this new-fangled thing they were calling the abacus. Many of the best had serious concerns: would it scale? what about updates? localization? Turned out to be a big hit.

Jeff Slutz
abacus really handled all those three things.. my only gripe with it is that it doesn't come with a fart app yet, btw here's a POC - http://www.youtube.com/watch?v=hmRXh3ApswM
Anurag
Fantastic video! I love the one on the left running an abacus from a cloud server.
Jeff Slutz
+6  A: 

Duke Nukem Forever and I'm still hoping it will catch on!

Bryan
The correct name is "Duke Nukem For Never"
M28
+3  A: 

Virtual memory. It wasn't around when I started coding BASIC on the ABC80, nor on the Amiga until I got a card with an MMU.

Marcus Lindblom
+2  A: 

Depends on what you mean by "started out." When I was in grade school I convinced my parents to buy a computer after playing around with an Apple ][. They got an Apple ///. That was not the next big thing, but I did learn BASIC and LOGO. After we got a Mac I didn't code much any more for a while (unless you count Hypercard, which was also not the next big thing).

When I took my first CS class in college, we had to learn Scheme (without a doubt, not the next big thing). That was 1995, so the World Wide Web was the next big thing. That one panned out, though there was a lot of skepticism about it initially, and for good reason. The early days of the WWW were pretty ugly: AOL, geocities, compuserve, terrible quality images, animated GIFs, the blink tag, tiled background images, Netscape 1.0, IE 1.0, hideous JavaScript hacks, etc.

When I took my first full time programming job at IBM in 1998, the next big thing was e-commerce. That panned out as well (eventually), though there were plenty of reasons to be skeptical then too. At the time, IBM was using Net.Data and Lotus Domino to build web applications (WebSphere was just starting to gain steam). IBM was also making money hand over fist with this stuff at the time, so it seemed to be a good bet. I left after a year of writing crappy web applications and went back to school. The dot com boom imploded shortly thereafter.

I think every successful Next Big Thing always starts out really ugly, gets really overhyped, dies down a bit, then (if people actually use it to solve real problems) it sticks around until the next Next Big Thing takes over.

chameleonman
+4  A: 

Hard to say. Starting out seemed to last a while, as I did my first programming at age 13 or 14 on a Commodore 64. Back then there was no WWW (there was an internet, but no-one knew about it) and I had no modem, so the programming "next big thing" was something I didn't know much about.

Possibly hardware accelerated graphics/sound - sprites and such - perhaps. But then again, Elite didn't get much benefit from sprites, so the Speccy version was probably about 4 times faster.

The first real programming big thing was probably Turbo Pascal - though the first version I used was 3, so I'm not sure that really counts. I do remember seeing an early version of Windows around that time, and thinking it was pointless.

Due to my odd history, I was starting out again when Windows for Workgroups 3.11 was around. By then, the GUI was a more obvious next big thing - but GEM on the PC was far from dead, the Lisa had been succeeded by a Mac or three etc. The speculation then was about this mysterious OS called Pink, and the equally mysterious NT. Oh - and if you wanted to be taken seriously, every sentence had to have the words "object oriented" in it. Pink was obviously going to beat NT, for instance, because Pink was object oriented. Even now, I don't know what makes one OS object oriented and another not (I hope it's not just the implementation language), but back then this kind of question just didn't matter.

Steve314
+2  A: 

The seminars that I went to... oh, the seminars. Cold Fusion, CSS had just become a W3C standard, Flash was just emerging (inspired by Shockwave), and oh yes, COM objects.

Kamikaze478
+1  A: 

In the local community college circa 1981, the big thing was "top down programming". When I had my own computer with a working floppy years later, it was a real pain to "top down" your way into a corner by declaring paradigms to exist that couldn't be translated to code.

Arthur Kalliokoski
+2  A: 

EASY! 16-color 4-bit CGA Monitors! Woo-hoo! 16-colors in all their glory! LET'S HEAR IT FOR CYAN!

Atømix
Whooo CGA FTW!!
Arcturus
+2  A: 

28.8 Modems... DOOM...

I started programming in VB 2 years ago...back in the days of AOL Internet.

Tim Carter
+1  A: 

Timesharing systems, Kemeny and Kurtz BASIC, and how computers would revolutionize teaching.

I implemented a 4-user time-sharing system with ASR33 teletypes that ran K&K BASIC on the Data General Nova minicomputer (serial #3) in 1969 for a company called "Educational Data Systems". The OS I implemented was eventually named ALICE and seemed to have a fair commercial following in the mid 70s; the company renamed itself to "Point4 Data Corp" after they started designing Nova clones with 400 ns cycle times. I tried to point out that it wouldn't be very long before they'd have to name themselves "Point2 Data Corp", but that fell on deaf ears.

You can't imagine how we fought for a few words of memory. Well, maybe you can; we had 8K words of 16-bit memory and a 100KB head-per-track swap drive to implement this.

Now I have 6GB of RAM, and the applications I write generate 250MB of object code that runs on machines with 24 CPUs. How times have changed :-{

Ira Baxter
+1  A: 

PHP probably. Perl was where all the magic was happening. At least for web programming.

natebc
A: 

Flash Drives

Netbooks

.NET

NetBeans

jQuery

jean27
+1  A: 

Sprites. I had a VIC-20, and it didn't have sprites like the CBM-64. I always thought sprites would make it so much easier to write games.

Guge
+1  A: 

Hampster (sic) Dance was huge...

Eric
+1  A: 

OOP - Object Oriented Programming.

I recall this "style" of programming becomming widely popular as I was finishing my degree.

Doug Stalter
+1  A: 

A viable desktop Linux and the proliferation of the open-source movement.

The Internet was still young, but it was widespread enough to enable open-source software distribution via the Internet instead of dialing into bulletin boards in far-away lands or requesting floppies by mail. As soon as this happened, several Linux distributions appeared that were reasonably capable of being used as the primary OS on a home computer. The expanding Internet yielded an expanding catalog of high-quality open source software, libraries, and tools within easy reach. As projects like Apache started to take hold, it was clear that open-source software, fueled by a growing Internet population, was going to be making more and more of an impact as time went forward.

bta
+1  A: 

The internet. Damn, I feel old.

Also, "fourth generation languages" (Delphi, Clarion, etc).

The internet ended up faring a little better than the 4GLs, obviously.

kekekela
+1  A: 

Windows 3.0... the first time I learned the lesson to always wait for the X.1 version...

MadMurf
+1  A: 

I came of age with free AOL CDs, GeoCities, and "coding" in HTML. Blink tags, marquee, Mosaic...

Yuck. I am everything that people are nostalgic/rancorous towards on the internet.

Alex Mcp
+4  A: 

For me it's n-tier architecture, since client-server architecture was the most widely used at that time.

Jojo Sardez
A: 

COBOL V2, CICS Command Level and DB2 - and now I'm moving into RIA, what a lot of water under the bridge!

persistent
+1  A: 

2003, and this crazy movement called “web standards” was getting some traction.

Paul D. Waite
A: 

'C'

When I first started programming, around 1981 or so, it was just escaping its systems-programming niche and becoming a serious contender as a replacement for Fortran.

Cruachan
+1  A: 

The "Next Big Thing" when I started out with programming was the destruction of the Berlin Wall. It was all over the news, radio, papers, and basically everywhere I looked.

Anurag
+1  A: 

UNIX. My college replaced an IBM 1130 with a PDP-11/70 running some sort of pre-V7 UNIX in 1976. My fortune was made.

Charles E. Grant
+1  A: 

When I started programming, computers themselves were the "next big thing". In high school, I knew an engineer who showed me the schematic drawings for a Burroughs 2000, and I was impressed but mystified. When I got to college, I got to do a project on an IBM 1620, using Load-and-Go Fortran II, with a typewriter and a CalComp plotter for output. I cut my teeth on that thing, writing a program to design four-bar linkages. It was truly amazing how much better it worked than a slide rule (though not nearly as portable).

Mike Dunlavey
+1  A: 

Acoustic telephone couplers that allowed one to dial into a mainframe computer and interact with it at a blazing 1200 baud. Yahoo! No more punch cards even though we still had to use JCL (for those who know what that means).

Oh, and the C programming language was just getting its feet wet as a serious language. My company sent me to Houston for a week to learn it so that I could mentor my project team on how to program in it.

sizzzzlerz
A: 

I started in 1966. The biggest change in my first few years was the IBM 360 Model 67. It introduced time sharing - multiple users running programs on the machine at the same time. Imagine that!

I've seen every language since Fortran come along. I think C# is the best so far, and LINQ to SQL is the first programmer's revolution in decades - in terms of productivity improvements.

I really think ORMs brought the first major revolution since relational databases were invented.

Oh, and I still program some on a daily basis. I never feel more comfortable than when I'm programming.

Jerry Rubin
A: 

C (after Pascal, and to replace Asm)

kmontgom
+1  A: 

For me, the next big thing was broadband Internet.

I grew up out in the boondocks (literally!) so we were blessed with the almighty 56k modem.

I recall running several-hundred-foot long ethernet cables across our back alleys so my neighbors and I could play Warcraft. :-D

thinkswan
+1  A: 

I remember sitting on a bed in a hotel with my mom and watching Bill Gates give a demonstration of Windows 95 on TV.
I thought at the time I'd never get to actually use it...

BlueRaja - Danny Pflughoeft
A: 

Apple's Cocoa Touch ^^

Simon
+2  A: 

AJAX was the first "next big thing" I became aware of. Back then I decided I'd learn programming, and started with... ASP.

Jorge
A: 

The release of the Fortran-77 standard, followed by the compiler.

Fossaw
A: 

It was certainly FORTRAN, the "language of the class enemy".

Ingo
A: 

Graphical user interfaces

Prolog and expert systems

Multi-threading on PCs

Andy Johnson
A: 

When I started learning programming, XT and AT machines were all the rage. We even had a color monitor in our lab! Turbo C and Turbo C++ were a very big thing back then.

More later...

(Aside: This was at St. Joseph's Evening College Computer Center, Museum Road, Bangalore. I was in the morning batch.)

Vulcan Eager
A: 
  • Designing with web standards (i.e. dogmatic separation of content from presentation)
  • Write-once, Run-Anywhere
  • XML
Rodrick Chapman
+1  A: 

Ruby on Rails, 3 years back when I started programming.

Christy John
A: 

Ruby on Rails...

Cole
A: 

Computers were just starting to get into homes and classrooms, and we all know how that turned out. Oh yeah, RAM was 32KB.

bluesmoon
+1  A: 

Cloud computing

Chiddy
A: 

fortran..........

i am a girl
A: 

GUIs - remember that Macintosh ad about why 1984 wasn't going to be like "1984"? YouTube video of ad

bigtang
+1  A: 

For me it was fractals and fractal geometry, particularly how they could be used to generate artificial landscapes such as this.
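
Just to illustrate the idea (my own sketch, not from the original answer): the simplest fractal-landscape trick is 1-D midpoint displacement - recursively split each segment and nudge the midpoint by a random amount that shrinks at each level. In Python:

    import random

    def midpoint_displacement(left, right, roughness=0.5, depth=8):
        """Return a list of heights forming a fractal ridge line."""
        heights = [left, right]
        scale = 1.0
        for _ in range(depth):
            new_heights = []
            for a, b in zip(heights, heights[1:]):
                mid = (a + b) / 2 + random.uniform(-scale, scale)
                new_heights += [a, mid]
            new_heights.append(heights[-1])
            heights = new_heights
            scale *= roughness  # smaller bumps at finer scales
        return heights

    ridge = midpoint_displacement(0.0, 0.0)
    print(len(ridge), "points, first few:", [round(h, 2) for h in ridge[:6]])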

Also, Artificial Intelligence (AI) has been the next big thing since John McCarthy coined the term in 1956.

andand
A: 

Virtual Reality:

The glasses, the games, the bio-feedback gloves and outfits... it was the means of doing surgery from the moon.... it was all going to be VR forever!

Until people realized just how silly they looked wearing those HUUUUGE helmets and awful VDTs-as-glasses. And just how impractical it was to have a Cray computer in your living room to drive the thing.....

Of course, virtual-reality is still with us, but my point is that it has not BECOME us. We are still living in a mostly analog world.

exoboy
+1  A: 

Limited OOP support in VB4!

JP
That eventually progressed to Limited OOP support in VB6!
John K
+1  A: 

I fondly remember getting horrible headaches using my Timex Sinclair and a horrible black-and-white TV for a monitor. It was really cheap (in every sense of the word), and you'd never think of it as a professional instrument, but it's where I first got my programming feet wet.

PSU
A: 

Oh, my. I still remember all the jazz. It was 1994, and the things that made a lot of noise were:

  • the Iomega Zip drive
  • Yahoo
  • Perl 5, and a little bit later
  • Java (after some googling, this was actually 1995, but time passed quite slowly back then :-)
Michael Foukarakis
A: 

Background: I learned to program by doing graphics on a C=64 and later an Apple II.

The C=64 had character graphics, with a bunch of built-in characters for line art. It also had a couple of "high-resolution" modes that were fairly limited, especially with colors, and somewhat difficult to use. Finally, it had up to 8 sprites which were easy to move and even had collision detection, but limited in size, number, color, and so on.

The Apple II had a few different graphics modes, again with varying resolutions and color capabilities. Drawing to the graphics area was easier, with one huge limitation: the graphics memory didn't correspond linearly to what you saw on the screen! After the first line, it would jump down 64 pixels or something. There were BASIC routines to handle this for you, but nothing easy to use from assembly, IIRC. There was also a "shape table" of some sort that I played with, which had some neat features (very different from the Commodore's sprites), but I never put in enough time to really figure it out.
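
(For the curious: the interleave being described is usually documented like this. A rough sketch of my own, not from the original answer, of where a hi-res row lives in page 1 - memory-adjacent 40-byte groups really are 64 screen lines apart.)

    HIRES_PAGE1 = 0x2000

    def hires_row_address(row):
        """Base address of screen row 0..191 in Apple II hi-res page 1 (as commonly documented)."""
        assert 0 <= row < 192
        return (HIRES_PAGE1
                + (row % 8) * 0x400        # which 1K block
                + ((row // 8) % 8) * 0x80  # which 128-byte chunk within the block
                + (row // 64) * 0x28)      # which 40-byte group within the chunk

    # Rows 0 and 64 sit only 40 bytes apart, while row 1 is a whole 1K away.
    print(hex(hires_row_address(0)), hex(hires_row_address(64)), hex(hires_row_address(1)))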

So the Next Big Thing was computers with good graphics programming capabilities, by which I meant:

  • square pixels!
  • linear frame buffer addressing
  • a color model that didn't completely suck

But then I went to college and ended up working mostly on X11 workstations, which had (a) square pixels!, (b) better HLLs and libraries (and assembly language wasn't really feasible) so the addressing became a non-issue, and (c) better colors but an absolutely terrible color model. (Seriously, who can figure out colors in Xlib?)

Today when I want graphics, I tend to write HLL code that generates SVG. The SVG coordinate system has real dimensions (so I don't even need to think about "pixels"). The SVG graphics model combines the best aspects of C=64 sprites, the Apple II shape table, and X11, and is far nicer than any of them in every respect. And its color model is perfectly easy to understand.

So I didn't get what I was asking for, but I did eventually get what I really wanted.
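
A tiny illustration of the "HLL code that generates SVG" workflow (my own sketch in Python, assuming nothing about the answerer's actual setup):

    # Generate a small SVG file from a high-level language instead of poking pixels.
    def circle(cx, cy, r, fill):
        return '<circle cx="%d" cy="%d" r="%d" fill="%s"/>' % (cx, cy, r, fill)

    svg = "\n".join([
        '<svg xmlns="http://www.w3.org/2000/svg" width="200" height="100">',
        circle(50, 50, 30, "steelblue"),
        circle(140, 50, 30, "tomato"),
        '</svg>',
    ])

    with open("demo.svg", "w") as f:
        f.write(svg)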

Ken