views: 22449

answers: 122

This question arose from comments about different kinds of progress in computing over the last 50 years or so.

I was asked by some of the other participants to raise it as a question to the whole forum.

The basic idea here is not to bash the current state of things but to try to understand something about the progress of coming up with fundamental new ideas and principles.

I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"

+19  A: 

I believe Unit Testing, TDD and Continuous Integration are significant inventions after 1980.

krosenvold
Testing first is a very old method that has been resurrected, I believe.
John Nolan
That's a software engineering thing, not a "computing" thing
SquareCog
@Dmitriy I find that a bit reductionistic
krosenvold
Yea I would find it hard to believe that nobody had done all that stuff before. Especially unit tests.
Quibblesome
I'd agree with John, for instance Brooks describes a test-first approach in The Mythical Man-Month (1975).
Fabian Steeg
Testing as such is undoubtedly older than 1980, and I'm sure someone also thought it'd be best to test up-front. Lacking a better distinction I'd say all the significant advances in this area are post-1980. I'm sure Leonardo da Vinci was planning to test his helicopter.
krosenvold
Continuous integration was first done seriously in BBN Lisp 1.85 in the late 60s, which became Interlisp at PARC. Smalltalk at PARC in the 70s was also a continuous integration system.
Alan Kay
Jay Bazuzi
+2  A: 

MPI and PVM for parallelization.

duffymo
No, concurrent and distributed programming has been considered the "next big thing" since at least the 60s/70s.
BobbyShaftoe
MPI really is some ancient technology. It's awesome that you can write fast parallel code in C but, gag, you shouldn't have to do it at such a low level! (cf. shading languages/CUDA/GPGPU).
Jared Updike
I thought there were MPI bindings for more modern languages, like Java. http://i.cs.hku.hk/~lchen2/javampi.html
duffymo
It amazes me how little modern programmers know about past programming. This is a classic example. What's next? Thin clients?
Stu Thompson
+27  A: 

Outside of hardware innovations, I tend to find that there is little or nothing new under the sun. Most of the really big ideas date back to people like von Neumann and Alan Turing.

A lot of things that are labelled 'technology' these days are really just a program or library somebody wrote, or a retread of an old idea with a new metaphor, acronym, or brand name.

frankodwyer
You can't see the forest since all the trees are in the way... The building blocks are much the same, but the result has changed/evolved.
Johan
...That's the definition of technology ;) "the practical application of knowledge..."
steamer25
I agree it's time for the next big thing. I'm tired of all the re-packaging of things forgotten from the past as something new. Like Javascript = AJAX.
James
+4  A: 

The changes to infrastructure to allow accessible internet from home and office.

Documented and accepted standards from W3C through to APIs

Apart from that most of what we'd think of as new dates back a lot longer than you'd think (e.g. GUI, OOP).

Richard Harrison
+1  A: 

Adoption of Object Orientation.

The idea was around earlier (e.g. Simula), but it became mainstream in the 1990s. (IMHO, one of its greatest benefits is providing a common vocabulary amongst developers, so its widespread adoption made it much more valuable.)

Oddthinking
"OO was around earlier (e.g. Simula)... What a beautiful answer to a question from Alan Kay. :-)
Ahruman
To expand your comment, Alan Kay is the inventor of Smalltalk, the first hugely relevant OOP language (I think Simula died early, in practical use). The first mainstream Smalltalk was Smalltalk-80, actually :-).
Blaisorblade
@[Blaisorblade]: Honored to have Dr. Kay on this humble site - nevertheless, Simula was technically the first OOP language. Smalltalk was the first "pure" OO environment, i.e. where everything was an object.
Steven A. Lowe
Didn't he also come up with the term OO?
bruceatk
Ooops! I didn't look at the name of the question-asker, and if I had there is no way I would have believed it was *that* Alan Kay! I also would have gushed embarrassingly about how OO changed my (software development) life, so perhaps it was for the best.
Oddthinking
There were several systems that were as "object-oriented" as Simula I, including a file system (early 60s) in USAF, Sketchpad (1962), the B5000 hardware. The stuff that I gave the term "object oriented" to was a somewhat different orientation that was sparked by these earlier systems (and Biology)
Alan Kay
I work mostly in object-oriented languages and I don't see much evidence of the widespread, commercial adoption of object-oriented programming. :-p
cartoonfox
+27  A: 

One thing that astounds me is the humble spreadsheet. Non-programmer folk build wild and wonderful solutions to real world problems with a simple grid of formulas. Replicating their efforts in a desktop application often takes 10 to 100 times longer than it took to write the spreadsheet, and the resulting application is often harder to use and full of bugs!

I believe the key to the success of the spreadsheet is automatic dependency analysis. If the user of the spreadsheet was forced to use the observer pattern, they'd have no chance of getting it right.

So, the big advance is automatic dependency analysis. Now why hasn't any modern platform (Java, .Net, Web Services) built this into the core of the system? Especially in a day and age of scaling through parallelization - a graph of dependencies leads to parallel recomputation trivially.
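
To make "automatic dependency analysis" concrete, here is a toy recalculation engine in Python - a sketch only, with invented names; a real spreadsheet engine would topologically sort the dependency graph, detect cycles, and could recompute independent branches in parallel:

    class Sheet:
        def __init__(self):
            self.formulas = {}   # cell name -> (list of input cells, function)
            self.values = {}

        def set(self, name, inputs, fn):
            self.formulas[name] = (inputs, fn)
            self.recalc(name)

        def recalc(self, changed):
            # recompute the changed cell, then every cell that depends on it
            inputs, fn = self.formulas[changed]
            self.values[changed] = fn(*(self.values[i] for i in inputs))
            for name, (deps, _) in self.formulas.items():
                if changed in deps:
                    self.recalc(name)

    s = Sheet()
    s.set("A1", [], lambda: 2)
    s.set("A2", [], lambda: 3)
    s.set("A3", ["A1", "A2"], lambda a, b: a + b)   # A3 = A1 + A2
    s.set("A1", [], lambda: 10)                     # the user edits A1...
    print(s.values["A3"])                           # ...and A3 is already 13

The user never registers an observer; the dependency graph is recovered from the formulas themselves, which is exactly the burden the spreadsheet lifts.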

Edit: Dang - just checked. VisiCalc was released in 1979 - let's pretend it's a post-1980 invention.

Edit2: Seems that the spreadsheet is already noted by Alan anyway - if the question that brought him to this forum is correct!

Daniel Paull
I had thought of this answer, but Visicalc was released just a smidgen before the 1980 deadline. (http://en.wikipedia.org/wiki/VisiCalc)
Oddthinking
@oddthinking - jinx. Just edited the post with same info.
Daniel Paull
but this reveals an interesting point: just presenting a simple way to display and manipulate data created an incredibly useful class of tools. is there some other 'enabling' idea like this? do we need one? i think so.
Javier
See also: http://stackoverflow.com/questions/357813/help-me-remember-a-quote-from-alan-kay
splattne
I agree wholeheartedly. Automatic dependency analysis could be and should be a part of modern programming languages.
Jesse Pepper
Nope, you are right. Visicalc was pre-1980
Alan Kay
I don't understand what spreadsheets are and why/how people use them!
hasen j
+171  A: 

Free Software Foundation (Established 1985)

Even if you aren't a wholehearted supporter of their philosophy, the ideas that they have been pushing (free software, open source) have had an amazing influence on the software industry and on content in general (e.g. Wikipedia).

Oddthinking
unix itself was born as a collaborative freely distributed project of Bell Labs and collaborators (and subsequently UC Berkeley and other sources with variants and contributions.) It has taken some ugly detours, but it's effectively Open Source now because the cat was out of the bag from birth.
le dorfier
Most database technology was born the same way. In many senses, FSF and its ilk are simply restoring what had previously been provided by educational and corporate basic research facilities.
le dorfier
Agree that the FSF has been very influential, but there is a tendency among its advocates to espouse "group think". So many FSF advocates cannot accept that Apple OSX and MS Windows are much better than any open source OS for the average user. No one wants to admit that.
RussellH
The entire purpose of the FSF is to promote software that can be freely used, modified, and redistributed by all. OSX and Windows are not "better" at this by any definition.
Adam Lassek
@RussellH: you're confusing "Open Source" and "Free (as in Freedom) Software". Your comment, in fact, illustrates precisely why the distinction is important. But anyway, Firefox is better than Internet Explorer and Safari, and it's more important to users than Windows vs MacOS vs Linux.
niXar
niXar: how is RussellH confusing Open Source and Free Software? Can you please point to their definitions and tell me any difference that has to do with RussellH's post? Can you point to any license that's one and not the other?
Jonas Kölker
I will never embrace communist principles. How did this get upvoted so much?
Janie
Janie, you don't have to be a supporter to see that the principles that they are pushing have had a major effect on the industry. I have no interest in getting dragged into a discussion as to whether the FSF is communistic, or whether you should embrace some communist principles.
Oddthinking
@Jonas: It's a bit like the difference between the People's Front of Judea and the Judean People's Front, but only a bit. There's a definite difference between OS and FS: http://www.gnu.org/philosophy/open-source-misses-the-point.html
outis
Legal invention, not computing invention.
Charles Stewart
+1  A: 

Utilization of functional programming/languages within OS core development.

sharkin
Depending on what you consider a functional language to be, LISP was invented in the 1950s, APL was invented in the 1960s and John Backus (of BNF fame) gave us FP in the 1970s.
Jason
Yes, misguiding, I'll edit
sharkin
still wrong, unfortunately. there were LISP machines long ago; i don't think there's anything more 'core' than that nowadays.
Javier
"Can Programming Be Liberated from the von Neumann Style?" is from 1978. The attempt to apply FP to writing an OS is from Turner in 1985, and gave rise to the whole industry of functional I/O. +1
Charles Stewart
+12  A: 

Open Source community development.

sharkin
Actually, the SIG/M user group disks kind of pre-date what we now call open source. It contained hundreds of disks (of the floppy variety) full of CP/M software, much of it open source (although the term "open source" didn't exist then).
Mike Thompson
In the sense of open cooperation and development among people who had access to a computer, it's much like the IBM user groups in the 1960s. It's just that more people can afford computers now.
David Thornley
Agree with David; it's only become more prominent now as computers have moved from the education and scientific areas into the business world. This gave rise to "closed source" software and confusing licenses. It was always there; it just didn't need a name until the lawyers got involved.
sascha
Yes, I must also agree with David here. Open Source is way earlier than 1980. Predates it by at least 20 years. I thought it was the 1950s not the 1960s though.
Brendan Enrick
+106  A: 

I think it's fair to say that in 1980, if you were using a computer, you were either getting paid for it or you were a geek... so what's changed?

  • Printers and consumer-level desktop publishing. Meant you didn't need a printing press to make high-volume, high-quality printed material. That was big - of course, nowadays we completely take it for granted, and mostly we don't even bother with the printing part because everyone's online anyway.

  • Colour. Seriously. Colour screens made a huge difference to non-geeks' perception of games & applications. Suddenly games seemed less like hard work and more like watching TV, which opened the doors for Sega, Nintendo, Atari et al to bring consumer gaming into the home.

  • Media compression (MP3s and video files). And a whole bunch of things - like TiVO and iPods - that we don't really think of as computers any more because they're so ubiquitous and so user-friendly. But they are.

The common thread here, I think, is stuff that was once impossible (making printed documents; reproducing colour images accurately; sending messages around the world in real time; distributing audio and video material), and was then expensive because of the equipment and logistics involved, and is now consumer-level. So - what are big corporates doing now that used to be impossible but might be cool if we can work out how to do it small & cheap?

Anything that still involves physical transportation is interesting to look at. Video conferencing hasn't replaced real meetings (yet) - but with the right technology, it still might. Some recreational travel could be eliminated by a full-sensory immersive environment - home cinema is a trivial example; another is the "virtual golf course" in an office building in Soho, where you play 18 holes of real golf on a simulated course.

For me, though, the next really big thing is going to be fabrication. Making things. Spoons and guitars and chairs and clothing and cars and tiles and stuff. Things that still rely on a manufacturing and distribution infrastructure. I don't have to go to a store to buy a movie or an album any more - how long until I don't have to go to the store for clothing and kitchenware?

Sure, there are interesting developments going on with OLED displays and GPS and mobile broadband and IoC containers and scripting and "the cloud" - but it's all still just new-fangled ways of putting pictures on a screen. I can print my own photos and write my own web pages, but I want to be able to fabricate a linen basket that fits exactly into that nook beside my desk, and a mounting bracket for sticking my guitar FX unit to my desk, and something for clipping my cellphone to my bike handlebars.

Not programming related? No... but in 1980, neither was sound production. Or video distribution. Or sending messages to your relatives in Zambia. Think big, people... :)

Dylan Beattie
I think media compression is not a new concept (it goes back to Shannon's work in 50s), it's just become feasible with improved hardware (fast enough, able to play the media).
porneL
I would have to agree with fabrication being something I think may be one of the next big things. When object "printers" become mainstream (printers that can replicate simple physical items that are durable) I think we will be there.
Andy Webb
It would also be great to scan existing items so replacements can be made. I have on many occasions had to shop for an odd screw or part to replace one that broke around the house or on my bike. With such a system I could scan the old part, repair it in software, and then create the replacement.
Andy Webb
And if you see piracy purely as a problem, you will _hate_ that future. :-)
Ahruman
@Ahruman - have you read "Printcrime" by Cory Doctorow? Short story dealing with exactly that subject... http://craphound.com/?p=573
Dylan Beattie
I agree the invention of MP3 and other non-lossless compression methods was very significant
kohlerm
Interesting story Dylan and interesting thought on how piracy fits into all of this.
Andy Webb
Desktop publishing and high quality printing was invented at Xerox PARC in the 70s, some of the Altos back then also had high quality color screens. The Internet predated 1980. Media compression predated 1980. The question is about what fundamental new technologies have been invented since 1980
Alan Kay
Agree with Alan there - None of these are new inventions, all the items mentioned above are simply advancements in technology of older concepts.
sascha
Didn't the Apple II have colour before 1980?
Tom Hawtin - tackline
You sir, are a visionary. Do not let the man get you down. 'Printing' printers is the next big revolution.
Waylon Flinn
Fabricating objects at home is already well on its way. Check out 3D printing: http://en.wikipedia.org/wiki/3D_printing
Peter Di Cecco
I'm not familiar with the idea of 3D printing, but what you're talking about sounds like a nanofactory (http://en.wikipedia.org/wiki/Molecular_assembler#Nanofactories)
Tshepang
+1  A: 

'Singularity', and all projects like it, i.e. development of operating systems in managed code.

sharkin
again, LISP machines and APL code were the original ideas... and failures.
Javier
That's not a post-1980 invention (Lisp and Smalltalk).
Jules
+6  A: 

Effective Parallelization and Quantum Computing - I think these are two areas where progress has been made and much more progress will be made to make very significant changes to our use of computing power.

Effective Parallelization, meaning parallelizing and distributing processing without the need for special programming techniques, but where it is built into the compiler/framework.
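
As a rough sketch of that direction (not a claim that the problem is solved): today you can keep the shape of an ordinary map over a pure function and let a framework decide how to spread it across cores. The worker function and inputs below are invented for illustration.

    from concurrent.futures import ProcessPoolExecutor

    def expensive(n):                        # any pure, CPU-bound function
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        inputs = [200_000, 300_000, 400_000, 500_000]

        # Sequential: the "how" is an explicit walk over the data.
        sequential = list(map(expensive, inputs))

        # Parallel: same shape of code; the framework decides the scheduling
        # and the distribution over processes/cores.
        with ProcessPoolExecutor() as pool:
            parallel = list(pool.map(expensive, inputs))

        print(sequential == parallel)        # True; only the execution strategy changed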

Cade Roux
Both of them are still promising but not widely used. Especially quantum computing - hey, you could break RSA, but up to now factoring 15 is an amazing achievement. And while the complexity of building classical computers scales linearly, the one for quantum computers "scales exponentially".
Blaisorblade
The Burroughs B5000 designed in 1961 and deployed in 1962-3 was shipped with multiple CPUs and a higher level language and automatic hardware support to allow this to be done safely
Alan Kay
+15  A: 

HTM systems (Hierarchical Temporal Memory).

A new approach to Artificial Intelligence, initiated by Jeff Hawkins through the book "On Intelligence".

Now active as a company called Numenta where these ideas are put to the test through development of "true" AI, with an invitation to the community to participate by using the system through SDKs.

It's more about building machine intelligence from the ground up, rather than trying to emulate human reasoning.

sharkin
When they do something interesting, I will be the first and loudest leader of the applause
Alan Kay
Wow. I need to learn more about this!
Daddy Warbox
+82  A: 

Package management and distributed revision control.

These patterns in the way software is developed and distributed are quite recent, and are still just beginning to make an impact.

Ian Murdock has called package management "the single biggest advancement Linux has brought to the industry". Well, he would, but he has a point. The way software is installed has changed significantly since 1980, but most computer users still haven't experienced this change.

Joel and Jeff have been talking about revision control (or version control, or source control) with Eric Sink in Podcast #36. It seems most developers haven't yet caught up with centralized systems, and DVCS is widely seen as mysterious and unnecessary.

From the Podcast 36 transcript:

0:06:37

Atwood: ... If you assume -- and this is a big assumption -- that most developers have kinda sorta mastered fundamental source control -- which I find not to be true, frankly...

Spolsky: No. Most of them, even if they have, it's the check-in, check-out that they understand, but branching and merging -- that confuses the heck out of them.

merriam
If anything should count as a significant new invention, it's git.
hasen j
hasen j: git is a fantastic DVCS; however, there were several others implemented before git - git is a significant new -implementation- of an idea.
Arafangion
+222  A: 

The Internet itself pre-dates 1980, but the World Wide Web ("distributed hypertext via simple mechanisms") as proposed and implemented by Tim Berners-Lee started in 1989/90.

While the idea of hypertext had existed before (Nelson’s Xanadu had tried to implement a distributed scheme), the WWW was a new approach for implementing a distributed hypertext system. Berners-Lee combined a simple client-server protocol, markup language, and addressing scheme in a way that was powerful and easy to implement.
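
Just to show how small each of those combined pieces is, here is a sketch of a raw HTTP request using only the Python standard library - the address names the resource, the protocol is one plain-text exchange, and the markup comes back as the payload (example.com is simply a placeholder host):

    import socket

    host, path = "example.com", "/"                  # the addressing scheme (host + path of a URL)
    request = ("GET " + path + " HTTP/1.1\r\n"       # the protocol: one plain-text request
               "Host: " + host + "\r\n"
               "Connection: close\r\n\r\n")

    with socket.create_connection((host, 80)) as sock:
        sock.sendall(request.encode("ascii"))
        response = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break
            response += chunk

    print(response.split(b"\r\n")[0])                # e.g. b'HTTP/1.1 200 OK'
    print(b"<html" in response.lower())              # the markup comes back as the payload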

I think most innovations are created in re-combining existing pieces in an original way. Each of the pieces of the WWW had existed in some form before, but the combination was obvious only in hindsight.

And I know for sure that you are using it right now.

splattne
+1 for the most obvious but also the most easily forgotten because we all take it for granted :)
PolyThinker
I'm not using the World Wide Web right now. I'm using a series of tubes known as the internets, achieved via the google.
Robert S.
@le dorfier, The World Wide Web is a system of interlinked hypertext documents accessed via the Internet, it's not TCP/IP networking. World Wide Web was begun in 1989.
Roberto Russo
WWW is an implementation of hypertext. Hypertext was invented in the 60's.
bruceatk
@bruceatk, I guess every invention is a combination of existing parts - and hypertext is a very important component, but not the only one which made WWW such a success.
splattne
@bruceatk: Hypertext is an implementation of text. Text was invented in 3500 BC.
Portman
@splattne/Portman - I see WWW as the logical progression of hypertext that took off when the environment was ready for it. I can admit that it is enough of a leap to be considered an invention. The invention part was actually written about in 1980 by Tim Berners-Lee, so it probably predates 1980.
bruceatk
C:\ has become http://
Roberto Russo
@Roberto, and: DIR has become http://www.google.com/
splattne
@bruceatk: I don't believe he wrote about the WWW until 1989. http://www.w3.org/People/Berners-Lee/
Portman
+1 for mentioning xanadu, which in my opinion was much better concept (in terms of scalability and wiki-style versioning) than how WWW got implemented.
dusoft
@splattne: And think has become search
kaizer.se
3 people missed the upvote :/
Martin
+17  A: 

I started programming Jan 2nd 1980. I've tried to think about significant new inventions over my career. I struggle to think of any. Most of what I consider significant were actually invented prior to 1980 but then weren't widely adopted or improved until after.

  1. Graphical User Interface.
  2. Fast processing.
  3. Large memory (I paid $200.00 for 16k in 1980).
  4. Small sizes - cell phones, pocket pc's, iPhones, Netbooks.
  5. Large storage capacities. (I've gone from carrying a large 90k floppy to an 8 gig usb thumb drive.)
  6. Multiple processors. (Almost all my computers have more than one now, software struggles to keep them busy).
  7. Standard interfaces (like USB) to easily attach hardware peripherals.
  8. Multiple Touch displays.
  9. Network connectivity - leading to the mid 90's internet explosion.
  10. IDE's with Intellisense and incremental compiling.

While the hardware has improved tremendously the software industry has struggled to keep up. We are light years ahead of 1980, but most improvements have been refinements rather than inventions. Since 1980 we have been too busy applying what the advancements let us do rather than inventing. By themselves most of these incremental inventions are not important or powerful, but when you look back over the last 29 years they are quite powerful.

We probably need to embrace the incremental improvements and steer them. I believe that truly original ideas will probably come from people with little exposure to computers and they are becoming harder to find.

bruceatk
"original ideas will probably come from people with little exposure to computers" so true. and even more sad since most of that 'numbing' exposure is windows/office.
Javier
Some dates for earlier inventions: Engelbart's GUI was demoed in 1968 and the Xerox PARC Alto was developed in 1973. Multiple CPUs are new on the desktop, but not in the machine room -- the VAX cluster was first available in 1978.
Hudson
You were programming before I was born. Dang I have a long way to go.
Kezzer
Ouch. I didn't start until I was 26, now I really feel old. :)
bruceatk
Did you factor in inflation for that $200 16k memory chip?
Tim Tonnesen
@Tim Tonnesen - I paid $200 1980 dollars for that 16k. I don't know what it would be now. It was for an Atari 800 that I paid $750.00 for with 24k. $444.00 for a 90k floppy drive. I just looked it up. $200.00 is $497.17 in 2007, $750 is $1874 and $444 is $1103.
bruceatk
I note that 8 of the 10 are hardware improvements. The remaining are GUIs and IDE technology. GUIs are from the 60's or 70's (http://en.wikipedia.org/wiki/History_of_the_graphical_user_interface). So in 30 years, all that's new, software-wise, is IDE autocompletion? That make me a sad panda :(
Jonas Kölker
+34  A: 

JIT compilation was invented in the late 1980s.
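
For readers who haven't met the idea, here is a toy caricature in Python: interpret an expression at first, and once it turns out to be hot, compile it to a closure so later evaluations skip the interpretive dispatch. Real JITs emit machine code and profile far more carefully; the AST encoding and the threshold here are invented for illustration.

    HOT_THRESHOLD = 3    # invented: after this many runs, compile instead of interpreting

    def interpret(expr, env):
        # expr is a nested-tuple AST: ("num", 2), ("var", "x"), ("add", a, b), ("mul", a, b)
        op = expr[0]
        if op == "num": return expr[1]
        if op == "var": return env[expr[1]]
        a, b = interpret(expr[1], env), interpret(expr[2], env)
        return a + b if op == "add" else a * b

    def compile_expr(expr):
        # turn the AST into a plain Python closure, removing the dispatch overhead
        op = expr[0]
        if op == "num":
            n = expr[1]; return lambda env: n
        if op == "var":
            name = expr[1]; return lambda env: env[name]
        f, g = compile_expr(expr[1]), compile_expr(expr[2])
        return (lambda env: f(env) + g(env)) if op == "add" else (lambda env: f(env) * g(env))

    class JitExpr:
        def __init__(self, expr):
            self.expr, self.calls, self.compiled = expr, 0, None

        def __call__(self, env):
            if self.compiled:                    # fast path once the expression is hot
                return self.compiled(env)
            self.calls += 1
            if self.calls >= HOT_THRESHOLD:      # hot enough: compile for future calls
                self.compiled = compile_expr(self.expr)
            return interpret(self.expr, env)

    square_plus_one = JitExpr(("add", ("mul", ("var", "x"), ("var", "x")), ("num", 1)))
    print([square_plus_one({"x": x}) for x in range(5)])   # [1, 2, 5, 10, 17]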

Jasper Bekkers
Well, the whole work on the implementation of the Self language (which was completely JIT-compiled) was amazing, and its usefulness can be seen today for Javascript inside Google V8. And that's from the late '80s and early '90s.
Blaisorblade
I first saw this idea in the last chapter of John Allen's book Anatomy of Lisp, published in the 70s. He gave a ref to a 70s PhD thesis as the originator.
Darius Bacon
Maybe we should refine it to "profile-based adaptive JIT compilation" such as the Self JIT or Sun's Java HotSpot
kohlerm
One of the PhD theses in the early 1970s which had JIT was Jim Mitchell's at CMU -- he later went to PARC -- Alan Kay
Alan Kay
If JIT is defined as what Self did, like Wikipedia defines it, then it really seems to be an 80s concept. But, if you do define it like that, what is the really important concept: the bytecode compilation that goes way back, or the optimization that JIT represents?
Daniel
As I mentioned right before your post ....
Alan Kay
Nori, K.V.; Ammann, U.; Jensen; Nageli, H. (1975). The Pascal P Compiler Implementation Notes. Zurich: Eidgen. Tech. Hochschule. (Thanks wikipedia)
Arafangion
A: 

A really hard question, since aside from ridiculously improved hardware, there are few things that have been significantly positive inventions since that time. Though there are many significant inventions from before 1980 that affect people only now, because they were infeasible back then.

Heck. Descent

Cheery
+1  A: 

Not sure about 1980, but the AI community has been an idea-generator for decades, and they're still at it.

Mike Dunlavey
+21  A: 

Computer Worms were researched in the early eighties of the last century at the Xerox Palo Alto Research Center.

From John Shoch and Jon Hupp's "The 'Worm' Programs - Early Experience with a Distributed Computation" (Communications of the ACM, March 1982, Volume 25, Number 3, pp. 172-180):

In The Shockwave Rider, J. Brunner developed the notion of an omnipotent "tapeworm" program running loose through a network of computers - an idea which may seem rather disturbing, but which is also quite beyond our current capabilities. The basic model, however, remains a very provocative one: a program or a computation that can move from machine to machine, harnessing resources as needed, and replicating itself when necessary.

In a similar vein, we once described a computational model based upon the classic science-fiction film, The Blob: a program that started out running in one machine, but as its appetite for computing cycles grew, it could reach out, find unused machines, and grow to encompass those resources. In the middle of the night, such a program could mobilize hundreds of machines in one building; in the morning, as users reclaimed their machines, the "blob" would have to retreat in an orderly manner, gathering up the intermediate results of its computation. Holed up in one or two machines during the day, the program could emerge again later as resources became available, again expanding the computation. (This affinity for nighttime exploration led one researcher to describe these as "vampire programs.")

Quoting Alan Kay: "The best way to predict the future is to invent it."

splattne
I believe this work actually predates the 80s.
BobbyShaftoe
Charles Stewart
+29  A: 

Software:

  • Virtualization and emulation

  • P2P data transfers

  • community-driven projects like Wikipedia, SETI@home ...

  • web crawling and web search engines, i.e. indexing information that is spread out all over the world

Hardware:

  • the modular PC

  • E-paper

mjy
Virtualization was implemented on VM/CMS in 1972. What do you mean by "the modular PC"?
Hudson
I think P2P and wikipedia remarkably changed the world
dr. evil
I think that by "the modular PC" he means that anyone can buy almost interchangeable components and build their own computer.
p5ycho_p3nguin
P2P was invented at Xerox PARC in the 70s -- the Altos were all P2P and the file resources and printers and "routers" were all P2P Altos
Alan Kay
+1 for E-paper. Not the rest.
sascha
I saw "E-paper" and thought, what? how does that effect me day to day. I'm glad it exists but e-Readers are not very important technologies on a widespread basis, compared to say, the cellphone or iPod.
Jared Updike
I'm still looking for a decent, cheap eReader... and I still haven't found one that can totally replace a true dead-tree edition.
kkaploon
I'd like to point out that about 40-50 years ago everyone was still doing math on paper mainly and saying the same about computers...
RCIX
+1  A: 

To answer a slightly different question: I think we need big ideas in the areas of Privacy, Trust and Reputation. My computer has the ability to capture almost everything about me - where I am, what I say, what I type, what I see... A huge amount of information, with an equally large number of entities (people, shops, sites, services) with whom I might want to share some of that information, even if it's just a single piece of data.

My information needs to be mine (not Google's, Facebook's or Apple's). My computer needs to use it on my behalf, and so trust needs to be end-to-end. Then we can dis-intermediate the new information middle men.

it depends
So, your answer is more about 1984, not 1980.
splattne
:-) Yes. I want to dismantle the Ministry of Search.
it depends
Our cell phones are now capable of sampling our location geochronologically (i.e. in four dimensions) to a resolution of 1 sec. of time; and automatically submitting it to the phone network. Asynchronously queued, for efficiency. A conceptual technology pattern increasingly discussed among us here.
le dorfier
Like I say, it doesn't address the original question, but today reputation, etc., is typically handled through an intermediary. Google, PayPal or Facebook are today's Ma Bell; the comms is end-to-end but trust is too often through a middle man; it needs to be end-to-end too.
it depends
+10  A: 

Ideas around Social Computing have had advances since 1980. The Well started in 1985. While I'm sure there were online communities before, I believe some of the true insights in the area have happened post-1980. The adverse dynamic aspects of social communities and their interaction with a software system are much like the disaster of the Tacoma Narrows Bridge.

I think Clay Shirky's work in the area illuminates those effects and how to mitigate them. I'd say interesting real world examples of social software insights include things like reCAPTCHA and Wikipedia, where significant valuable work is done by the participants mediated by the software.

Steve Steiner
Check out what Engelbart was really about, starting in 1962
Alan Kay
One could also go back to Vannevar Bush and Memex. Vannevar's work doesn't negate Engelbart's. I doubt anything can be truly said to be without precedent.
Steve Steiner
Also consider Control Data's PLATO CAI system, which had substantial social interactions - circa 1965-72.
Eric Brown
+15  A: 

Nothing.

I think it's because people have changed their attitudes. People used to believe that if they would just find that "big idea", then they would strike it rich. Today, people believe that it is the execution and not the discovery that pays out the most. You have mantras such as "ideas are a dime a dozen" and "the second mouse gets the cheese". So people are focused on exploiting existing ideas rather than coming up with new ones.

So many of the existing ideas just haven't been implemented yet.
Breton
There are always a few lunatics that will come up with new ideas, they just can't help it ;-)
Johan
But they're lunatics, so they can't sell their ideas because nobody will listen to them.
Adam Jaskiewicz
Ideas are more the province of artists. Practical implementation is what we guys do. Looking to engineers for brand new ideas is kind of fishing in the wrong pond. For bright new ideas, read SF and figure out how this stuff could be done (I figure a lot of it could be done). However, implementing a wild idea can take years. Artists can get away with selling ideas and dreams, but engineers are expected to come up with products... and they have to eat too.
Sylverdrag
+18  A: 

The use of physics in human-computer interaction to provide an alternative, understandable metaphor. This, combined with gestures and haptics, will likely result in a replacement for the current common GUI metaphor invented in the 70's and in common use since the mid to late 80's.

The computing power wasn't present in 1980 to make that possible. I believe games likely led the way here. An example can easily be seen in the interaction of list scrolling on the iPod Touch/iPhone. The interaction mechanism relies on the intuition of how momentum and friction work in the real world to provide a simple way to scroll a list of items, and the usability relies on the physical gesture that causes the scroll.
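
A toy version of that flick-scroll mechanic, just to show how little physics is needed for the metaphor to feel natural (the constants are arbitrary):

    def flick_scroll(offset, velocity, friction=0.95, dt=1.0 / 60, min_speed=5.0):
        """Coast after a flick: keep moving with the release velocity, bleeding
        off momentum each frame until the motion is too slow to perceive."""
        frames = []
        while abs(velocity) > min_speed:
            offset += velocity * dt
            velocity *= friction          # friction drains the momentum
            frames.append(offset)
        return frames

    # A flick released at 2000 px/s coasts for a few hundred pixels, fast at
    # first and easing out, which is what makes the list feel physical.
    print(round(flick_scroll(0.0, 2000.0)[-1]))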

Steve Steiner
The earliest example I can think of was Randy Smith's Alternate Reality Kit, built in Smalltalk-80 at PARC in '86 or '87. You could implement new objects with a physical metaphor. Every object had location, mass, momentum, and a pop-up menu for interacting with it via its message interface.
PanCrit
+2  A: 

(Widespread) Encryption. Without encryption, no financial transaction would ever take place. And this is still an area which could use more innovation and user-friendliness.

Mauli
When did the trapdoor and public key ideas get invented? Hint: before 1980
Alan Kay
+19  A: 

Mobile phones.

While the first "wireless phone" patent was in 1908, and they were cooking for a long time (0G in 1945, 1G launched in Japan in 1979), modern 2G digital cell phones didn't appear until 1991. SMS didn't exist until 1993, and Internet access appeared in 1999.

Domchi
Japan in 1979, that's pre 1980. We're looking for new inventions - think research labs, universities, practical demonstrations of patent applications... all which will predate the mass-market availability by a number of years.
sascha
The difference between 1G and 2G is about as big as difference between analog and digital computer. I think 2G (1991) deserves the status of "new" invention.
Domchi
And is dependent on powersave technologies and good batteries.
Johan
+53  A: 

What about digital cameras?

According to Wikipedia, the first true digital camera appeared in 1988, with mass market digital cameras becoming affordable in the late 1990s.

Domchi
But the idea, the invention and the patents were there in the early 70's (See the section on "Early Development")
sascha
That's a good point, I stand corrected.
Domchi
Digital camera? One wonders, judging from up votes, what people understand today by the term "computing".
MaD70
Pictures are what modern consumer computing is based around. Without a webcam, a point-and-shoot, or an expensive SLR (for newspapers), modern consumers wouldn't really need computers.
Marius
@MaD70: I guess you're not so much into photography, are you? Just to name a few: automatic face recognition, autofocus, "panoramic mode", automatic white balance ... it definitely falls into computing.
nico
@nico: one doesn't need to be an expert in digital photography to appreciate the algorithmic sophistication of software on digital cameras. I meant: programmable electronics is pervasive these days; if you refer to all these applications as "computing" then nearly everything is computing. For me "computing" is in such algorithms, and they are certainly not bound to digital camera hardware.
MaD70
@MaD70: the development of those algorithms has been strongly pushed by the new camera hardware (and vice versa), so I would say that even if they're not strictly bound, they are definitely strongly related.
nico
@nico: I'm not an expert in that field (the last thing I read was "Principles of pictorial information systems design", by Shi-Kuo Chang, which is a 1989 book) but I seriously doubt that some (many?) of those algorithms are really new, rather than adaptations of old algorithms from other image-processing fields (satellite image processing, for example). Of course, you can be an expert in digital image processing and up-to-date with current research **and** knowledgeable of its history. In such case I'll shut up: *ubi maior minor cessat*.
MaD70
@MaD70: Oh, I wouldn't consider myself a super expert in that. Anyway, I guess you will agree with me that it is pretty normal for algorithms, as it is for hardware, to evolve and adapt from previous ones. You should check out some of the new algorithms used in modern microscopy... that's absolutely stunning new stuff (OK, it's probably not the type of digital camera he was talking about, but still...)
nico
Sorry, the first prototype digital camera was made by Kodak in 1975 apparently. http://pluggedin.kodak.com/post/?ID=687843
Mark Ransom
+2  A: 

I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"

The way that I see it, we have not had so many new ideas in computing because we largely haven't needed them. We have been milking the old ideas, and getting so much out of them, such as the phenomenal growth of cpu speed.

When we need new ideas because the "well has run dry" so to speak, then we will see that necessity is the mother of invention.

Alex Baranosky
Blaisorblade
Yes, I know it is coming to an end, and so I am sure a new chapter of computing will be born from it. Necessity is the mother of invention, right?
Alex Baranosky
I think it's clear at this point that advancement in cpus is coming more from parallelization than speed.
Adam Lassek
+2  A: 

I would also nominate the 3D mouse. There are several variants in existence from the early 1990s. For anyone working with 3D, things like the SpaceNavigator make life much easier. (Disclaimer: I'm not affiliated with 3Dconnexion in any way, just a satisfied and now RSI-free user.)

Domchi
+3  A: 

The one activity I can think of that wasn't there in 1980 was Global Searching Across Disjoint Domains, i.e. Google and a (very few) predecessors - all of which were well post-1980. Associated with conventions for syntactic markup, I think it qualifies as a "new idea"; but I think it also has only just begun; there's a lot of overhead space to build up into.

One device that has the potential to accelerate this already lightning-speed vector will soon emerge as the combination camera/GIS/phone/network. It creates the opportunity to automatically collect, classify, and aggregate datapoints in four-dimensional space for the first time. Even tedious manual collections of this type of data are sprouting; imagine when it's done by default.

For better or worse.

le dorfier
+5  A: 

As for programming concepts, IoC / Dependency Injection in 1988, with roots in 1983. Fowler has some notes on the history of the concept on his Bliki.
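
For anyone who hasn't run into the term, a minimal sketch of constructor-based dependency injection - the class and method names are invented, and a real IoC container would automate this wiring from configuration or annotations:

    class SmtpMailer:
        def send(self, to, body):
            print("SMTP ->", to, ":", body)

    class FakeMailer:                      # a test double, injectable in unit tests
        def __init__(self):
            self.sent = []
        def send(self, to, body):
            self.sent.append((to, body))

    class SignupService:
        def __init__(self, mailer):        # the dependency is handed in, not built inside
            self.mailer = mailer
        def register(self, email):
            self.mailer.send(email, "Welcome!")

    SignupService(SmtpMailer()).register("a@example.com")   # production wiring
    svc = SignupService(FakeMailer())                        # test wiring
    svc.register("b@example.com")
    print(svc.mailer.sent)                                   # [('b@example.com', 'Welcome!')]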

Domchi
+4  A: 

I think the laptop was invented around 1980 and I also think that the development of laptops and portable computing changed a lot of people's lives - certainly those of us who work in IT, or who use computers and travel.

Brabster
You do know that Dr. Kay originated the idea for the laptop, known as the Dynabook back then. Even as late as 1994, when I first read about the Dynabook, I hoped that something as "good" as its design would come to market. And here we are.
Robert S.
+4  A: 

I'd say the biggest trend is an ever increasing lack of location dependence and pervasiveness. An interesting philosophical exercise these days is to count the computers in your immediate area. They're everywhere: desktops, keyboards, microwaves, radios, televisions, cell phones, etc... My grandmother is computer illiterate; however, her life is as infested with small computers as everyone else's. She can make a call to me from the middle of an empty field. I can then answer that call zipping down the highway.

+23  A: 

To address the two questions about "Why the death of new ideas", and "what to do about it"?

I suspect a lot of the lack of progress is due to the massive influx of capital and entrenched wealth in the industry. Sounds counterintuitive, but I think it's become conventional wisdom that any new idea gets one shot; if it doesn't make it at the first try, it can't come back. It gets bought by someone with entrenched interests, or just FAILs, and the energy is gone. A couple examples are tablet computers, and integrated office software. The Newton and several others had real potential, but ended up (through competitive attrition and bad judgment) squandering their birthrights, killing whole categories. (I was especially fond of Ashton Tate's Framework; but I'm still stuck with Word and Excel).

What to do? The first thing that comes to mind is Wm. Shakespeare's advice: "Let's kill all the lawyers." But now they're too well armed, I'm afraid. I actually think the best alternative is to find an Open Source initiative of some kind. They seem to maintain accessibility and incremental improvement better than the alternatives. But the industry has gotten big enough so that some kind of organic collaborative mechanism is necessary to get traction.

I also think that there's a dynamic that says that the entrenched interests (especially platforms) require a substantial amount of change - churn - to justify continuing revenue streams; and this absorbs a lot of creative energy that could have been spent in better ways. Look how much time we spend treading water with the newest iteration from Microsoft or Sun or Linux or Firefox, making changes to systems that for the most part work fine already. It's not because they are evil, it's just built into the industry. There's no such thing as Stable Equilibrium; all the feedback mechanisms are positive, favoring change over stability. (Did you ever see a feature withdrawn, or a change retracted?)

The other clue that has been discussed on SO is the Skunkworks Syndrome (ref: Geoffrey Moore): real innovation in large organizations almost always (90%+) shows up in unauthorized projects that emerge spontaneously, fueled exclusively by individual or small group initiative (and more often than not opposed by formal management hierarchies). So: Question Authority, Buck the System.

le dorfier
I loved Framework, and you can still buy it, but it's expensive.
Norman Ramsey
Yup. www.framework.com. I wish.
le dorfier
It's always easier to have new ideas in a new area of knowledge, so a very large number of the important ideas came about in the 1950s and 1960s. We just can do most of them a whole lot better now.
David Thornley
I think this reply and the comments are very well put.
Alan Kay
@David: "whole lot better now". And cheaper. And smaller. Which enables new ways of doing *other* things better. E.g. 10 songs -> 1,000 songs -> 1,000 albums in my pocket, sure it is a matter of degree but it changes everything, even if someone back before 1980 showed it could be done, in theory, on a giant mainframe. The pieces may have been there but some inventions, like the iPod, are more than the sum of the parts.
Jared Updike
+19  A: 

Better user interfaces.

Today’s user interfaces still suck. And I don't mean in small ways but in large, fundamental ways. I can't help but notice that even the best programs still have interfaces that are either extremely complex or that require a lot of abstract thinking in other ways, and that just don't approach the ease of conventional, non-software tools.

Granted, this is due to the fact that software allows you to do so much more than conventional tools. That's no reason to accept the status quo though. Additionally, most software is simply not well done.

In general, applications still lack a certain “just works” feeling and are too much oriented toward what can be done, rather than what should be done. One point that has been raised time and again, and that is still not solved, is the point of saving. Applications crash, destroying hours of work. I have the habit of pressing Ctrl+S every few seconds (of course, this no longer works in web applications). Why do I have to do this? It's mind-numbingly stupid. This is clearly a task for automation. Of course, the application also has to save a diff for every modification I make (basically an infinite undo list) in case I make an error.

Solving this problem isn't even actually hard. It would just be hard to implement it in every application, since there is no good API to do this. Programming tools and libraries have to improve significantly before allowing an effortless implementation of such efforts across all platforms and programs, for all file formats, with arbitrary backup storage and no required user interaction. But it is a necessary step before we finally start writing “good” applications instead of merely adequate ones.
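
As a sketch of the kind of primitive I mean - an editing buffer that records every change so "save" and "undo" stop being the user's job - something like the following, which keeps whole snapshots for brevity where a serious implementation would persist compact diffs in the background:

    import difflib, time

    class AutoSavingDocument:
        """Every edit is recorded, so nothing is lost to a crash and undo is
        unbounded. (Sketch only: whole snapshots are kept in memory; a real
        implementation would persist compact diffs to disk as the user types.)"""
        def __init__(self, text=""):
            self.text = text
            self.history = []                    # (timestamp, previous text)

        def edit(self, new_text):
            self.history.append((time.time(), self.text))
            self.text = new_text                 # "saving" happens on every edit

        def undo(self):
            if self.history:
                _, self.text = self.history.pop()

        def last_change(self):
            # what did the most recent edit change? shown as a unified diff
            if not self.history:
                return ""
            _, prev = self.history[-1]
            return "".join(difflib.unified_diff(
                prev.splitlines(keepends=True),
                self.text.splitlines(keepends=True)))

    doc = AutoSavingDocument("hello world\n")
    doc.edit("hello, world!\n")
    print(doc.last_change())
    doc.undo()
    print(doc.text)                              # back to "hello world"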

I believe that Apple currently approximates the “just works” feeling best in some regards. Take for example their newest version of iPhoto, which features face recognition that automatically groups photos by people appearing in them. That is a classical task that the user does not want to do manually and doesn't understand why the computer doesn't do it automatically. And even iPhoto is still a very long way from a good UI, since said feature still requires ultimate confirmation by the user (for each photo!), since the face recognition engine isn't perfect.

Konrad Rudolph
Google's Picasa has had that for a while. In fact, picasa has so many other features that are slowly crawling into iPhoto.
AKRamkumar
A: 

I believe that nothing important was invented... but the perspective on software has changed a lot since the '80s. Back then there were more theoreticians involved in this thing, and now you are asking this question on a programmers' 'forum'.

Most of the ideas back then didn't get implemented, or when implemented they didn't have any real importance, as the software industry did not exist, nor did marketing or HR or development stages, or alpha versions :).

Another reason for this lack of inventions is the fact that most people use Windows :) Don't get me wrong, I do hate M$, but look at it this way: you have a perfectly working interface, with nothing new to add to it, maybe just some new colored buttons. It's also closed enough that you won't be able to do anything with it without breaking it. That's why I prefer open apps; this way you get more "open" people, to whom you can actually talk, ask questions, propose new ideas that actually get implemented, or at least put on an open todo-list, and thus you get some kind of "evolution". You don't really see anything new because you are stuck with the same basic interface "invented" lots of years ago... Did anyone actually try the ION window manager in a production environment? It has a new kind of interface, and actually lets you do things faster, even if it looks quirky.

M$, Adobe... you name it, they hold lots of patents, so you won't be able to base your work on them or on derivatives (you also won't know what kind of undeveloped technologies they hold). Look at MP3 and GIF as examples (I believe they are both free formats now, but they are also kinda dead..). MP3 is the 'king' of audio even if there are a few algorithms out there much better than it... but they didn't get enough traction because they weren't pushed onto the consumer market. And GIF... come on, 256 colors??? From this point of view I'm curious how many people in this thread are working on something "open" that will get reused in other projects, and how many on "closed" projects protected by NDAs.

Even if it sounds like a kind of "free willy" speech, back in the '80s software was free, you got documentation for everything, and all hardware was simpler and easier to work with... and also more limited, so people didn't actually waste time implementing 3D games or web pages but worked on real algorithms.

Quamis
Automatic down vote for anyone who writes "M$". That tired old cliche should have been retired from the vicious Slashdot peanut gallery in the late 90s. It's a shame for computer science that website and worn out anti-Microsoft Linux fanboi-ism remains to this day.
Judah Himango
+17  A: 

The rediscovery of the monad by functional programming researchers. The monad was instrumental in allowing a pure, lazy language (Haskell) to become a practical tool; it has also influenced the design of combinator libraries (monadic parser combinators have even found their way into Python).

Moggi's "A category-theoretic account of program modules" (1989) is generally credited with bringing monads into view for effectful computation; Wadler's work (for example, "Imperative functional programming" (1993)) presented monads as practical tool.

Jason Dusek
+9  A: 

I think the best ideas invented since the 1980's will be the ones that we're not aware of. Either because they are so small and ubiquitous as to be unnoticeable, or because their popularity hasn't really taken off.

One example of the former is Clicking and Dragging to select a portion of text. I believe this first appeared on the Macintosh in 1984. Before that you had separate buttons for picking the beginning of a selection and the end of a selection. Quite onerous.

An example of the latter is (maybe) visual programming languages. I'm not talking about something like HyperCard; I mean like Max/MSP, Prograph, Quartz Composer, Yahoo Pipes, etc. At the moment they are really niche, but the way I see it, there's really nothing stopping them from being just as expressive and powerful as a standard programming language, except for mindshare.

Visual programming languages effectively enforce the functional programming paradigm of referential transparency. This is a really useful property for code to have. The way they enforce this isn't artificial either - it's simply by virtue of the metaphor they use.

VPLs make programming accessible to people who would not otherwise be able to program, such as people with language difficulties, like dyslexia, or even just laymen that need to whip up a simple time-saver. Professional programmers may scoff at this, but personally, I think it would be great if programming became a really ubiquitous skill, like literacy.

As it stands though, VPLs are really a niche interest, and haven't really gone particularly mainstream.

What we should do differently

All computer science majors should be required to double major, coupling the CS major with one of the humanities: painting, literature, design, psychology, history, English, whatever. A lot of the problem is that the industry is populated with people that have a really narrow and unimaginative understanding of the world, and therefore can't begin to imagine a computer working any significantly differently than it already does. (If it helps, you can imagine that I'm talking about someone other than you, the person reading this.) Mathematics is great, but in the end it's just a tool for achieving; we need experts who understand the nature of creativity, who also understand technology.

But even if we have them, there needs to be an environment where there's a possibility that doing something new would be worth the risk. It's 100 times more likely that anything truly new gets rejected out of hand, rather viciously (the Newton is an example of this). So we need a much higher tolerance for failure. We should not be afraid to try an idea which has failed in the past. We should not fully reject our own failures, and we should learn to recognize when we have failed. We should not see failure as a bad thing, so we shouldn't lie to ourselves or to others about it. We should just get used to it, because it is just about the only constant in this ever-changing industry. Post mortems are useful in this regard.

One of the more interesting things about Smalltalk, I think, was not the language itself, but the process that was used to arrive at the design of Smalltalk: the iterative design process, going through many, many revisions, but also very carefully and critically identifying the flaws of the existing system, and finding solutions in the next one. The more perspectives, and the broader the perspectives we have on the situation, the better we can judge where the mistakes and problems are. So don't just study computer science. Study as many other academic subjects as you can get yourself to be interested in.

Breton
As usual there's always a counter-example! The MITSyn stream processing language is a pipe-oriented visual programming language from the early 1970s and is still available.
RobS
Really? Could you cite some documentation about this system please? I'd like to find out more.
Breton
Hrmn, just out of curiosity, has it ever struck anyone here how inadequate the metaphor of "language" is for representing a computation, or a program? Imagine the programs we could make if we had a more suitable metaphor. One where getting a semicolon in the wrong place didn't matter.
Breton
Not saying the VPLs of today are "it", but it demonstrates that metaphors other than "language" are possible for representing a computer program, and they each have different advantages and disadvantages.
Breton
Clicking and dragging through text: invented at Xerox PARC in the 70s. GRAIL at RAND in the 60s was both a visual language and tablet driven.
Alan Kay
Damn, this whole thread has just been "owned" by Alan Kay. But this all kind of proves my point. If there's a significant new idea or invention, none of us would be aware of it. It'll be fiercely guarded by whoever owns it, or entirely unrecognized by most everyone as a "good" idea.
Breton
Alan Kay asking us about significant new ideas is a bit like asking flatlanders to imagine the third dimension.
Breton
+5  A: 

Declarative Programming.

In 1979 "computer programs" were imperative. The programmer was expected to instruct the compiler on both what to do and how to do it. (N1)

Today, ASP.NET WebForms and WPF programmers regularly write code without knowing or caring how it will be implemented. Wikipedia has other, less mainstream examples. Additionally, all of the SGML-derived "markup" languages are declarative, and I doubt many of the programmers of 1979 would have predicted their importance or ubiquity in 30 years.

Although the concept of declarative programming existed before 1980 (see this paper from 1975), its invention took place with the introduction of Caml in 1985 (debatable) or Haskell in 1990 (less debatable). (N2) Since then, declarative programming has increased greatly in popularity. And, when massively multicore processors finally arrive, we'll all be declarative programmers.
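
A small contrast, with SQL standing in for the declarative side (purely illustrative): the imperative version spells out the loop, the test, and the ordering; the declarative version states the result wanted and leaves the access path to the engine.

    import sqlite3

    # Imperative: spell out *how* - the loop, the test, the ordering.
    def high_earners_imperative(rows):
        result = []
        for name, salary in rows:
            if salary > 50000:
                result.append(name)
        result.sort()
        return result

    # Declarative: state *what* - the engine picks the access path and the plan.
    def high_earners_declarative(conn):
        return [name for (name,) in conn.execute(
            "SELECT name FROM employees WHERE salary > 50000 ORDER BY name")]

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
    conn.executemany("INSERT INTO employees VALUES (?, ?)",
                     [("Ada", 90000), ("Bob", 40000), ("Cy", 60000)])
    print(high_earners_imperative([("Ada", 90000), ("Bob", 40000), ("Cy", 60000)]))
    print(high_earners_declarative(conn))    # both print ['Ada', 'Cy']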

--
Notes:
(N1) I can't vouch for this firsthand, since I was a fetus in 1979.
(N2) From other answers, it seems like people are confusing conception with invention. Da Vinci conceived of a helicopter, but he didn't invent it. The question is specifically on inventions in computing.
(N3) Please don't mention Prolog (rel. 1975) in the comments unless you have actually built an app in it.

Portman
Oracle and IBM came in 1979 with commercially available SQL databases, so the use of declarative programming is older than 1980.
tuinstoel
Declarative programming is a bit of an overloaded term. Declarative programming according to microsoft is usually a smart way of using XML to configure an application. Functional languages like Lisp, Scheme and Haskell allow for a different form of declarative programming.
Mendelt
Ivan Sutherland's Sketchpad was completely programmed declaratively and had no imperative features. And it wasn't the last declarative system done before 1980.
Alan Kay
In my humble opinion - there's no such thing as declarative programming. Even if you say only WHAT needs to be done you still know HOW it will be done, and if you don't - weird things will happen from time to time and you will have no clue WHY until you know HOW. That's why humans are needed here.
inkredibl
Suppose this were only about solving linear equations. We supply the relationships that we want to have simultaneously hold, and the solver program solves or says there is no solution. We know how the solver program works, but are programming completely declaratively....?
Alan Kay
Oh dear. I don't know what "Declarative programming" is, but I guess Prolog and ML fit the bill ‹check WP› yup it does.
niXar
@niXar: Caml was 1985. I will update the post now. And as for Prolog... I stand by my original footnote.
Portman
A: 

the Enterprise Service Bus would appear to be a fairly recent 'invention', though of course it is based on much older technologies.

Steven A. Lowe
+70  A: 

Damas-Milner type inference (often called Hindley-Milner type inference) was published in 1983 and has been the basis of every sophisticated static type system since. It was a genuinely new idea in programming languages (admittedly based on ideas published in the 1970s, but not made practical until after 1980). In terms of importance I put it up there with Self and the techniques used to implement Self; in terms of influence it has no peer. (The rest of the OO world is still doing variations on Smalltalk or Simula.)

Variations on type inference are still playing out; the variation I would single out most is Wadler and Blott's type class mechanism for resolving overloading, which was later discovered to offer very powerful mechanisms for programming at the type level. The end to this story is still being written.
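As a rough illustration of the machinery involved, here is a toy Python sketch of the unification step that Damas-Milner style inference rests on; the term representation is invented for the example (type variables are bare strings, base types are one-element tuples) and the occurs check is omitted.

    def resolve(t, subst):
        """Follow the substitution until t is no longer a bound variable."""
        while isinstance(t, str) and t in subst:
            t = subst[t]
        return t

    def unify(t1, t2, subst):
        """Extend subst so t1 and t2 become equal, or raise TypeError."""
        t1, t2 = resolve(t1, subst), resolve(t2, subst)
        if t1 == t2:
            return subst
        if isinstance(t1, str):                    # free type variable
            return {**subst, t1: t2}               # (occurs check omitted)
        if isinstance(t2, str):
            return {**subst, t2: t1}
        if t1[0] == t2[0] and len(t1) == len(t2):  # same constructor and arity
            for a, b in zip(t1[1:], t2[1:]):
                subst = unify(a, b, subst)
            return subst
        raise TypeError("cannot unify %r with %r" % (t1, t2))

    # Unifying "a -> int" with "bool -> b" infers a = bool and b = int.
    s = unify(("->", "a", ("int",)), ("->", ("bool",), "b"), {})
    assert resolve("a", s) == ("bool",) and resolve("b", s) == ("int",)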

Norman Ramsey
+1 Static type systems are a huge *huge* step in software development. I couldn't agree with this answer more.
Jeremy Powell
+74  A: 

Here's a plug for Google map-reduce, not just for itself, but as a proxy for Google's achievement of running fast, reliable services on top of farms of unreliable, commodity machines. Definitely an important invention and totally different from the big-iron mainframe approaches to heavyweight computation that ruled the roost in 1980.
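A hypothetical single-machine sketch of the programming model, in Python, may help; the real system adds the distribution, fault tolerance, and re-execution that make it an achievement, but the map/shuffle/reduce shape is the same.

    from collections import defaultdict

    def wordcount_map(doc_id, text):
        # Emit (key, value) pairs; here one ("word", 1) pair per word.
        for word in text.split():
            yield word.lower(), 1

    def wordcount_reduce(word, counts):
        # Combine every value emitted for the same key.
        return word, sum(counts)

    def map_reduce(docs, mapper, reducer):
        groups = defaultdict(list)                 # the "shuffle" step
        for doc_id, text in docs.items():
            for key, value in mapper(doc_id, text):
                groups[key].append(value)
        return dict(reducer(k, vs) for k, vs in groups.items())

    docs = {1: "the cat sat", 2: "the cat ran"}
    print(map_reduce(docs, wordcount_map, wordcount_reduce))
    # {'the': 2, 'cat': 2, 'sat': 1, 'ran': 1}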

Norman Ramsey
map-reduce isn't an invention of Google at all.
akappa
I'm a functional programmer. My first language was APL. Your point, exactly?
Norman Ramsey
So (mapcar f l) and (reduce f l) in Lisp automatically run on arbitrary numbers of commodity machines, handling all intercommunication, failures, and restarts?
Jared Updike
The Google map-reduce doesn't have much at all to do with functional map-reduce.
Wahnfrieden
A: 

Computer Graphics, Special Effects, and 3D Animation

Gordon Bell
All available in the 60s and 70s. Texture mapping, for example, is from 1974.
Andrew Dalke
+8  A: 

Flying cars and hoverboards. Oh wait, those haven't been invented yet. But by 2015, we have to have them. Otherwise Back To The Future 2 will have been a big lie!

Kip
I like your humor (as in http://stackoverflow.com/questions/164432#164556) +1
VonC
Thanks! I guess not as many people liked the humor here though, since I'm still at 0
Kip
@Kip; when Doc is about to try the time machine for the first time, and travel "into the future", he's aiming for 25 years ahead of 1985... which is to say, now. I want my Mr. Fusion.
Dean J
+4  A: 

Podcasting. It allows for an informative way to distribute information and debate. I find it to be more interactive than standard interviews but with less noise than blog comments.

Jared
Harder to examine, though, and requires extra equipment (earphones) to avoid bothering cow-orkers.
Adriano Varoli Piazza
+4  A: 
VonC
Again, please check out what Engelbart demoed in 1968 (including live video chatting and screen sharing). IOW, guessing really doesn't work as well as looking things up. This is why most people make weak assumptions about when things were invented.
Alan Kay
I got smacked in the face by Alan Kay! I will never wash that cheek again ;) I guess my answer only contains two post-1980 bits of "invention" instead of 3 (even if their concept was around before): IRC (1988) and webcam (1991).
VonC
@VonC: How does IRC count for anything? It's a real-time chat medium -- an incremental refinement of something I'd been using long before IRC existed.
JUST MY correct OPINION
A: 

DOS. I'm not a DOS fan, but thanks to DOS and the IBM-PC, computers are what they are today (for better or worse).

Ubersoldat
1966: http://en.wikipedia.org/wiki/Dos
some
+52  A: 

Tagging, the way information is categorized. Yes, those little text boxes under the question.

It is amazing that it took about 30 years to invent tagging. We used lists and tables of contents, things which are optimized for printed books.

However, 30 years is much shorter than the time people needed to realize that printed books could be made in a smaller format, so that people could hold them in their hands.

I think the tagging concept is underestimated among core CS people. Research has focused on natural language processing (a top-down approach). But tagging is the first language that both computers and people understand well. It is a bottom-up approach to getting computers to use natural language.
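A rough sketch of the idea as an inverted index, in Python (names are invented for the example):

    from collections import defaultdict

    tag_index = defaultdict(set)                   # tag -> set of items

    def tag(item, *tags):
        for t in tags:
            tag_index[t.lower()].add(item)

    def items_with(*tags):
        # Items carrying *all* of the given tags (set intersection).
        sets = [tag_index[t.lower()] for t in tags]
        return set.intersection(*sets) if sets else set()

    tag("question-432922", "computing", "history", "inventions")
    tag("question-164432", "humor", "computing")
    print(items_with("computing"))                 # both questions
    print(items_with("computing", "history"))      # only question-432922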

Greg Dan
Agreed - this correlates with my submission that the only new thing I can think of is syntactic markup to query among many domains - but you stated it better.
le dorfier
Check out Engelbart ca 1962-72
Alan Kay
For me tagging is very much like early search engines that used the meta keywords tag (that's post-80's too; I'm just making the argument that tagging isn't worth mentioning).
porneL
While tagging in computing is relatively new approach, tagging is also a concept inherited from books; in books, it's called indexing.
Domchi
Libraries have been using "tags" since... well, I don't know, but for a long time. Think about the *book cards* (sorry, I'm not sure what they're called in English) tagged "books about xxx".
nico
A: 

The Eclipse Memory Analyzer, and its use of the Lengauer-Tarjan dominator tree algorithm for memory usage analysis.

kohlerm
porneL
A: 

Digital music synthesizers.

I think the whole music scene was affected by the availability of cheap polyphonic synths. The early polyphonic synths were effectively multiple analog synths (discrete or using CEM or SSM chips). They were both expensive and very limited. During the 80's, the first digital systems arrived (I am not sure, but I think Kurzweil was one of the first). Today, nearly all are digital - even the analog ones are typically "virtual analog".

regards

EDIT: oops - I just found out that the Fairlight CMI was introduced in 1978. So forget the above - sorry.

blabla999
+3  A: 

The Eclipse IDE

Bringing a Smalltalk-like IDE to the masses ;)

kohlerm
Not just Smalltalk-like: VisualAge, which became Eclipse, was written in Smalltalk.
MkV
So, a reimplementation of an Alan Kay/Xerox idea from 1976?
Charles Stewart
+25  A: 

Shrinkwrap software

Before 1980, software was mostly specially written. If you ran a business, and wanted to computerize, you'd typically get a computer and compiler and database, and get your own stuff written. Business software was typically written to adapt to business practices. This is not to say there was no canned software (I worked with SPSS before 1980), but it wasn't the norm, and what I saw tended to be infrastructure and research software.

Nowadays, you can go to a computer store and find, on the shelf, everything you need to run a small business. It isn't designed to fit seamlessly into whatever practices you used to have, but it will work well once you learn to work more or less according to its workflow. Large businesses are a lot closer to shrinkwrap than they used to be, with things like SAP and PeopleSoft.

It isn't a clean break, but after 1980 there was a very definite shift from expensive custom software to low-cost off-the-shelf software, and flexibility shifted from software to business procedures.

It also affected the economics of software. Custom software solutions can be profitable, but they don't scale. You can only charge one client so much, and you can't sell the same thing to multiple clients. With shrinkwrap software, you can sell lots and lots of the same thing, amortizing development costs over a very large sales base. (You do have to provide support, but that scales. Just consider it a marginal cost of selling the software.)

Theoretically, where there are big winners from a change, there are going to be losers. So far, the business of software has kept expanding, so that as areas become commoditized other areas open up. This is likely to come to an end sometime, and moderately talented developers will find themselves in a real crunch, unable to work for the big boys and crowded out of the market. (This presumably happens for other fields; I suspect the demand for accountants is much smaller than it would be without QuickBooks and the like.)

David Thornley
+49  A: 

Google's Page Rank algorithm. While it could be seen as just a refinement of web crawling search engines, I would point out that they too were developed post-1980.
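A toy power-iteration sketch in Python, with invented example links, gives the flavor; the production algorithm is of course far more involved.

    def pagerank(links, damping=0.85, iterations=50):
        # links: page -> list of pages it links to (all targets must be keys).
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                targets = outgoing or pages        # dangling page: spread evenly
                for target in targets:
                    new_rank[target] += damping * rank[page] / len(targets)
            rank = new_rank
        return rank

    links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
    print(pagerank(links))                         # "c" ends up ranked highest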

Bill the Lizard
"Just a refinement" is often an oxymoron. In this case, the refinement is the technology. The internet was a much scarier place before google brought ought that page rank algorithm (and delivered the results quickly and without page clutter, and all the other dredge that we use to have to suffer through to use other search engines in the past).
David Berger
i don't think you know what an oxymoron is.
Jason
Do you remember altavista and that little unknown company: yahoo?
voyager
@voyager: Hotbot and Lycos weren't bad, either.
Dean J
@Jason it's a **non-contradictory oxymoron**
Martin
@martin it's a **non-oxymoron oxymoron**. contradiction is in the definition: http://ninjawords.com/oxymoron
Jason
+4  A: 

Access to massive data.

The sheer size and scale of the data we have available these days is massive compared to what it used to be in the 80s. We've had to make a large number of changes to both our hardware and software to be able to store and display this stuff. One day, we'll actually learn how to qualify and mine it for something useful. Someday.

Paul.

Paul W Homer
A: 

Protected memory. Before protected memory, if your program made a mistake, you could start executing code anywhere - virtually always hanging the entire machine. That's right, reboot time!

Low cost of hardware. My first computer cost $500 in 1978- a huge sum at the time. Lowering costs put PCs on every desk.

james creasy
Protected memory was invented in the 60s, at latest.
Darius Bacon
It amazes me how little modern programmers know about past programming. This is a classic example. What's next? Thin clients?
Stu Thompson
+2  A: 

Ctrl-C + Ctrl-V + Ctrl-X combo :)

Guido
Don't forget Ctrl-X! I love the mnemonic nature of these - V looks like the tip of a glue bottle (glue pastes) and X looks like scissors (scissors cut). And of course Copy starts with C (at least in English).
RedFilter
an even better invention is the clip history. It's unfortunate that this is not built into most operating systems. And also unfortunate that the external programs that supply this functionality have such appalling interfaces and poor integration into the OS
Breton
That too comes from Xerox PARC: http://en.wikipedia.org/wiki/Ctrl-C
some
+1 for OrbMan's comment!
John Nolan
@Guido +1 for that! Sorry for the editing, hope you don't mind.
Secko
A: 

I'm not qualified to answer this in the general sense, but restricted to computer programming? Not much.

Why? I've been thinking about this for a while and I think we lack two things: a sense of history and a way to objectively judge everything we've produced. This isn't true in all cases but is in the general.

For history, I think it's just something not emphasized enough in popular writing or computer science programs. Take language features, for example. A canonical source might be HOPL, but it's definitely not common knowledge among programmers to be able to mark the point in time or in which language a feature like GC or closures first appeared. And of course after that there's knowledge of progression over time: how has OOP changed since Simula? Compare and contrast our sense of history with that of other fields like maybe political science or philosophy.

As for judgement, this is really a failure on our part to seek objective measures of success. Given foobar, in what measurable way has it improved some aspect in the act of programming where foobar is any of design patterns, agile methodology, TDD, etc etc. Have we even tried to measure this? What do we even want to measure? Correctness, programmer productivity, code legibility, etc? How? Software engineering should really be picking away at these questions, but I've yet to see it.

A: 

Natural Language Processing. The first time I encountered this was in the early 1990s with a program from Symantec called Q&A that let you query the database by typing English queries. I am still impressed by it to this day.

RedFilter
There was a lot of invention in natural language processing before 1980. For just one example, try looking up Terry Winograd on Wikipedia.
Walter Mitty
+2  A: 

Design Patterns which brought computer science closer to computer engineering. GPS and internet address lookup for location based interactions. Service Oriented Architecture (SOA).

jeffD
A: 

It's a little thing I like to call the internet

Harry
That little thing existed before 1980: http://en.wikipedia.org/wiki/History_of_the_Internet
some
ok fine! (adding more to the comment in order to submit!)
Harry
+24  A: 

me (1981)

Chris Ballance
Windows ME came later ;-)
splattne
you? (ok10char!!!)
hasen j
+1 for the "right" birth year
Jared Updike
+1!!! I was "invented" that year too :D
Andrei Rinea
+1  A: 

Multi-Agent Systems.

You can go back to distributed artificial intelligence roots, and I think still stay safely this side of the 80s.

There are many components to multi-agent systems, with lots of studies going into speech acts or cooperation, so it's rather difficult to point and say "See, here, this is different, innovative and important!" But I'll try anyway. :-)

I think the Belief-Desire-Intention model is particularly noteworthy. Agents have internally constructed models of the world. They have particular desires, or goals, and formulate plans on how to interact with the world as they know it to achieve those goals, thereby making up intentions.

Or, to use an analogy, the characters in Tron, the movie, have a certain understanding of how the world around them worked. They did not KNOW the whole world, and they could be mistaken about parts of it. But they had desires and goals, and they came up with plans to try to further that. If you saw Tron, I'm sure you'll get the analogy.

It hasn't had much of an impact on computing YET. But, see, things that have an impact on computing seem to take a few decades anyway. See: OOP, GC, bytecode compilation.

Daniel
A: 

I think part of the problem with these answers is that they are either not well researched or are touting a new implementation of some technology that has seen significant "improvements." However, that is not a significant invention. For instance, anything talking about functional programming or object-oriented programming just fails; most of these ideas have been circulating since before most of the participants of SO were born.

BobbyShaftoe
+1  A: 

The massive increases in processor speed that have occurred over the last 30 years can't be overlooked. All manner of clever ideas such as pipelining and branch prediction, as well as improvements on the electronic side of processor design, mean that programmers today can worry more about the design and maintainability of their programs and worry less about counting clock-cycles.

open-collar
that's not invention, that is evolution (making things bigger, better, badasser)
steffenj
Moore's Law has been in effect since before 1980
Stu Thompson
+1  A: 

StackOverFlow.com

Brendan Enrick
Voted up because it's funny. :)
ionut bizau
Voted down because it's not. :(
sharkin
Didn't do anything because i agree with R.A :|
Ólafur Waage
Ass kisser! .
Stu Thompson
Geez, this is timely. Just six months late...
duffymo
A: 
  1. The mouse - There have been posts about human interaction. To me, the mouse was the gateway to human interaction. Without it, we'd still be typing and not clicking and dragging, even with our fingers.

  2. GUI - Complemented the mouse perfectly. I work in an environment where an AS/400 is the backend of one of our major apps. Yeah.. interesting stuff, but it just reminds me of the screens 'Bill Gates' is working on in the movie 'Pirates of Silicon Valley', even though that's not what it was. To me, 1 and 2 are the reason anybody, including grandpas and grandmas, can use a computer.

  3. Excel / spreadsheets - Someone mentioned this before but it's worth mentioning again. It's so user friendly and is a great entry point for non-technical users to try their hand at simple programming concepts when performing calculations on cells. Granted it came out before 1980, but the versions post-1980 are when the technology in spreadsheets evolved.

  4. Internet (of course) - Not sure how people wrote code without it! Don't flame me for repeating because this belongs on every list.

  5. INTELLISENSE - LOVE IT LOVE IT LOVE IT!!!!

asp316
Mouse: Engelbart, 1968. GUI: was in Sutherland's Sketchpad, 1963. Internet: 1969.
Andrew Dalke
Perhaps strictly speaking they were invented then, but they weren't in use extensively in the 60s. I thought Al Gore invented the internet? ;)
asp316
+27  A: 

Modern shading languages and the prevalence of modern GPUs.

The GPU is also a low cost parallel supercomputer with tools like CUDA and OpenCL for blazing fast high level parallel code. Thank you to all those gamers out there driving down the prices of these increasingly impressive hardware marvels. In the next five years I hope every new computer sold (and iPhones too) will have the ability to run massively parallel code as a basic assumption, much like 24 bit color or 32 bit protected mode.

Jared Updike
Try it. You won't like it. Multi-core systems are much faster for most real-world problems. YMMV. Good for graphics, and not much else.
xcramps
There's a reason they're called GPUs and not PPUs... (Parallel processing units). Most people don't have the patience and/or skills to write good code for them. Though there is an increasing amount of research projects that are exploring using GPUS for non graphics purposes.
RCIX
I tried it. I liked it. I can run all of my Matlab code on the GPU, with no source code modifications apart from a few typecast changes which you can do with a search'n'replace. Google "Matlab GPU computing".
Gravitas
I agree with the OP. The programmable pipeline, while something we now might take for granted, completely changed the world of graphics, and it looks like it might continue changing other parts of the programming world. @xcramps: I think I'm missing something; last I checked, GPUs were multi-core systems. Just with a lot more cores. Kind of like... supercomputers. But I guess those aren't really being used for anything in the real world...
Perrako
+3  A: 

Open PC design that led to affordable components (except from Apple :-) and competition that drove innovation and lower prices. This caused the big change from the user going to the computer -- where there was a terminal to use -- to the computer coming to the user, appearing at home and even in one's lap.

Tom A
And keep in mind that because of this Macs are now on the same architecture as everybody else too ;-).
inkredibl
Multiple manufacturers were delivering S100 bus designs to users in 1976.
Dour High Arch
+5  A: 

Virtual Worlds in which you are represented by a virtual alter ego (aka Avatar), for socializing and roleplaying.

Most commonly referred to as MMOs - Massive(ly) Multiplayer Online. Some popular examples include World of Warcraft, Everquest, Second Life.

PS: no, they still don't require the heavy headgear as typically depicted in geek movies of the 80s. It's a shame....

steffenj
+4  A: 

Premise: virtually no new inventions since 1980.

The first thing to do is define invention, or else you'll get off on the wrong track. The second definition of invention from Dictionary.com says:

U.S. Patent Law. a new, useful process, machine, improvement, etc., that did not exist previously and that is recognized as the product of some unique intuition or genius, as distinguished from ordinary mechanical skill or craftsmanship.

Thus, since 1980, there have been very few new inventions in computing. What has there been? Obviously there has been large amounts of new technologies and new things coming about, but what are they?

We aren't inventing any more, we are improving what primarily exists already.

A simple example:

Development of the CD, or compact disc, started in 1977, though it wasn't accepted by industry until 1982. At that time the first factory for pressing CDs had just come into readiness. Eventually, by 1985, the CD-ROM (Read-Only Memory) was accepted as a medium. The CD-RW followed 5 years later. (Source: Wikipedia)

Now what? Well, given that we have larger hard drives (still just improvement on the paradigm) we need more space to be able to supplant the VHS market and make videos compatible with computers. Thus came about the DVD, though I am cutting out many improvements to the existing CD technology.

The DVD came about, was "invented", during the year of 1995. (Source: Wikipedia)

Since then we have had:

  1. Writable, and ReWritable DVDs
  2. Dual-layer DVDs
  3. Triple- and Quad-layer DVDs (unreleased though feasible through a simple driver revision)
  4. HD-DVD
  5. Blu-ray Disc

Obviously this list isn't all inclusive. But spot the new invention, remember the definition I gave above, in that list. You can't! They're all just variations on the concept of an optical disc, all just variations on the same hardware, and all just variations on existing software.

WHY?

Cost. See, it's cheaper economically to make incremental improvements to an existing product. If I can sell you an HD DVD or a Blu-ray Disc because you believe it to be necessary or cool, then I have no need to release my plans for the triple- or quad-layer DVDs. In fact, I can charge you through the nose just to get the new technology, because you are an early adopter and you need my "new and improved!" hardware.

This is called either marketing, or product relations.

But what about software?

What about it? Pre-1980 there was a lot of software inventiveness going on, but since then it has mostly just been improvements on what already exists or reinvention of the wheel. Look at any OS or office package to see this.

Conclusion

As far as I'm concerned, there have been virtually no new inventions in the past 29 years. I could wax long and cross a great many industries, but why should I bother? Once you start thinking about it, and start comparing an "invention" to a prior, similar product ... you'll find it is so similar that it isn't even funny. Even the internal combustion engine has been around since 1906 with no new inventions in that field since then; many improvements and variations of this "wheel" yes, but no new inventions.

Not even that new weapon America deployed in Iraq--the one that uses microwaves to make a person feel shocked like they touched a lightbulb--is new. The same idea was used in security systems, then classified and taken off the market, with ultrasound to make an intruder feel physically ill. This is a directed form of the weapon with a different wavelength and application, not a new invention.

The Wicked Flea
It's funny you mentioned U.S. Patent Law in your definition, because if you look at patents, especially software patents - you have quite a lot of "inventions" since 1980 ;-), it's a shame that they aren't real inventions just some kind of parodies, just like you said...
inkredibl
Software patents are mostly just "conceptual patents", which cover an umbrella of regions. These sorts of patents are an abuse of the system, in my opinion. They also aren't inventive at all. >_<
The Wicked Flea
The first 'computers' were really just an improvement on electronic calculators. The first electronic calculators were really just an improvement on mechanical adding machines. The first mechanical adding machines were really just an improvement on an abacus. The first abacus was really just an improvement on using your fingers. The first fingers were really just an improvement on legs, and the first legs were really just an improvement on wriggling around like a worm.
Kirk Broadhurst
In a similar analogy, the Great Wall of China was never 'built'. Individual bricks were laid that ever-so-slowly improved on what was already there. And every day people would say 'it's only slightly longer than it was yesterday, that's not exciting'.
Kirk Broadhurst
I don't see why, but you seem to have divorced development, effort, and planning from their association; it took great planning to get the Great Wall of China built. And the abacus wasn't an improvement on using fingers, unless you used base-6 notation with both hands and feet! An abacus can represent far larger numbers. Find me some "missing links" that were the surviving transitional fossils between these *distinct inventions* ... because last I saw people still use fingers, abacuses, calculators, and computers (yet more sophisticated an adding machine).
The Wicked Flea
+6  A: 

Touchscreens and Motion Sensing interfaces for human computer interaction.

For example:

  • Touchscreens for PDAs, iPhone or Nintendo DS
  • Motion Sensing, Nintendo Wii Controller or (to a lesser degree) SixAxis controller for Playstation 3.

Only question is ... are these technologies really post-80s?

steffenj
Touchscreens are 1960s era in origin, and part of the PLATO system in 1972. One of the games on PLATO using touchscreen? "squish the bug"
Andrew Dalke
Bear in mind that the PLATO terminal _alone_ cost several tens of thousands of dollars, and needed to be connected to a CDC Cyber mainframe, at $10 million or so. And these were 1960's era dollars, so multiply everything by at least 10, and probably closer to 100.
Eric Brown
+4  A: 

Electrically Erasable Programmable Read-Only Memory (EEPROM), generalized into non-volatile read/write memory, the most well known and ubiquitous form currently being Flash. http://en.wikipedia.org/wiki/EEPROM lists this as being invented in 1984.

By giving the storage medium the same general physics, power requirements, size and stability as the processing units, we remove this as a limiting factor in designs for where we place processors. This expands the possibilities for how and where we place 'intelligence' to such a plethora of smart devices (and things that would previously never have been candidates for being considered smart at all) that we are still caught up in the surge. MP3 players are really just a fraction of this.

ShuggyCoUk
+7  A: 

First of all, let the highest choir of angels sing your praise, Alan, for your undying contributions to the field that became my own passion as well. I don't think I can express my respect any more clearly than that, so I won't, and move on to your question.

The pre-1980 days were, of course, the glory days of Xerox PARC. Back when the GUI, the mouse, the laser printer, the internet, and the personal computer were all being created. (Seeing as I'm too young to have been alive back then, and you were pretty much working on inventing all of those, I can't tell you anything about 1980 that you don't already know, so let's move on.)

The thing is, though, that the pre-1980 days were a lot more vibrant in terms of truly disruptive new technologies. That's the way it is with any new field -- how many game-changing technology advances have you seen in railroads in the past 100 years? How many have you seen in lightbulbs? In the printing press? Once something ignites a hype in the right circles, there is an explosive period of invention, followed by a long period of maturing. After that, you're not going to see the same kind of completely radical changes again UNLESS the basic circumstances change.

Luckily, that might be happening in a number of fields, and it has already happened in a few others:

  • Mobility - smart phones bring computing to a truly portable platform, which will soon include location-based services and proximity-based ad-hoc networks. It's a completely new paradigm that's potentially as game-changing as the GUI has been

  • The WWW (HTTP, HTML and DNS) has already been mentioned and is an obvious addition to the list, since it is enabling global, inexpensive, mainstream rich communication across the globe - all thanks to a computing platform

  • On the interface side, both touch, multitouch (Jeff Han comes to mind) and the Wiimote need mentioning. Currently, they are basically curiosities, but so were the early GUIs.

  • OOP design patterns -- higher level solutions as best practices to hard problems. Depending on your definition of 'computing', it may or may not belong on the list, but if you count OOP as a significant advance pre-1980 (I certainly do), I think design patterns and the GoF deserve a mention too

  • Google's PageRank and MapReduce algorithms - I am pleased to notice I wasn't the first to mention them, and seriously --- where would the world be without the principles of both of them? I vividly remember what the world looked like before them, and suffice it to say Google really IS my friend.

  • Non-volatile memory -- it's on the hardware side, but it is going to play a significant role in the future of computing - making bootup times a thing of the past, for example, and enabling us to use computers in entirely new ways

  • Semantic (natural language) search / analysis / classification / translation... We're not quite there yet, but companies like Powerset give the impression that we're on the brink.

  • On that note, intelligent HTMs should be on this list as well. I am yet another believer in Jeff Hawkins' model and approach, and if it works, it will mean a complete redefinition of what computers can do, what it means to be human, and where the world can go from here. Creating a real intelligence in that way (synthetically) would be bigger than anything the human race has accomplished before.

  • GNU + Linux

  • 3D printing / rapid prototyping (and, in time, manufacturing)

  • P2P (which also lead to VoIP etc.)

  • E-ink, once the technologies mature a bit more

  • RFID might belong on the list, but the verdict is still out on that one

  • Quantum Computing is the most obvious element on the list, except we still haven't been able to get enough qubits to play along. However, my friends in the field tell me there's incredible progress going on even as we speak, so I'm holding my breath for that one.

  • And finally, I want to mention a personal favourite: distributed intelligence, or its other name: artificial artificial intelligence. The idea of connecting a huge number of people in a network and allowing them access to the combined minds of everyone else through some form of question answering interface. It's been done a number of times recently, with Yahoo Answers, Askville, Amazon Mechanical Turk, and so on, but in my mind, those are all missing the mark by a LOT... much like the many implementations of distributed hypertext that came before Tim Berners-Lee's HTML, or the many web crawlers before Google. Seriously -- someone needs to build a search interface into 'the hive mind' to blow everyone else out of the water. IMHO - it is only a matter of time.

Jens Roland
Design patterns were around earlier, just not applied to software engineering (Christopher Alexander wrote about them as applied to architecture). Arguably by their nature, they are discovered, not invented. That the GoF was able to write about them meant that they were pre-existing.
Adam Jaskiewicz
I know about architectural design patterns, but my point is still valid. The car was still invented in the late 1800s even if the locomotive existed before then. Software patterns are a different beast.
Jens Roland
Lightbulbs? How about LEDs? They used to be only green and red. The blue LED was the holy grail just 15 years ago; now they are everywhere. I remember seeing white LEDs for the first time in a Hewlett-Packard lab in 1998. The tungsten lightbulb is about to be outlawed due to its power consumption. Right now, we are in the biggest technology change in illumination since neon lights.
Guge
Linux is a significant new invention?! GNU?! Linux is a monolithic kernel ('50s or '60s) that is UNIX-like ('70s). The GNU project as a whole aims to catapult us into the future by providing a free clone of a '70s operating system. There's nothing new or innovative in either (aside from the chutzpah of their proponents claiming innovation).
JUST MY correct OPINION
Wikipedia says MVC was around in 1979, and it's definitely a pattern, although perhaps not formally described at the time in Alexandrian form. [http://en.wikipedia.org/wiki/Model–View–Controller](http://en.wikipedia.org/wiki/Model–View–Controller)
cartoonfox
+4  A: 

Optical computing. Seems like it should have been around longer but I can't currently find any references pre-dating 1982 or so (and the relevant piece of technology, the optical transistor, didn't pop up until 1986).

nezroy
My dad knows a man that patented a holographic computer, which was 100% holographic. I've no idea how it worked, but it was supposedly an extremely fast system.
The Wicked Flea
+69  A: 

BitTorrent. It completely turns what previously seemed like an obviously immutable rule on its head - the time it takes for a single person to download a file over the Internet grows in proportion to the number of people downloading it. It also addresses the flaws of previous peer-to-peer solutions, particularly around 'leeching', in a way that is organic to the solution itself.

BitTorrent elegantly turns what is normally a disadvantage - many users trying to download a single file simultaneously - into an advantage, distributing the file geographically as a natural part of the download process. Its strategy for optimizing the use of bandwidth between two peers discourages leeching as a side-effect - it is in the best interest of all participants to enforce throttling.

It is one of those ideas which, once someone else invents it, seems simple, if not obvious.
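A rough, hypothetical Python sketch of the "rarest first" piece-selection heuristic shows one ingredient of that organic distribution; it is illustrative only, not the actual protocol logic.

    from collections import Counter

    def rarest_first(my_pieces, peer_piece_sets):
        """Pick a piece I still need that the fewest peers currently have."""
        availability = Counter()
        for pieces in peer_piece_sets:
            availability.update(pieces)
        wanted = [p for p in availability if p not in my_pieces]
        return min(wanted, key=lambda p: availability[p]) if wanted else None

    mine = {0, 1}
    peers = [{0, 1, 2, 3}, {0, 2, 3}, {0, 3}]
    print(rarest_first(mine, peers))   # piece 2 (2 copies) before piece 3 (3 copies)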

Kief
True, although while BitTorrent may be somewhat different/improved, the *significant new invention* really should be P2P distribution, rather than any specific implementation like BitTorrent.
Ilari Kajaste
I disagree. P2P is not at all new; it's older than USENET. Pre-BitTorrent "P2P" apps for the desktop (Kazaa and the like) are simply a repacking of the client-server concept, adding a dynamic central directory of servers. Each "peer" client connects to a single other "peer" server to transfer a file. The fact that a single node does both is old hat (at least for pre-Windows systems). The BitTorrent protocol is (AFAIK) a completely new way to transfer files, which leverages multiple systems to transfer a file between one another in a truly distributed manner.
Kief
-1. In reality torrents are much slower than direct download, so the practical applications just don't support the theory. In reality you'll always have more leechers than seeders. Most ISPs throttle torrent traffic lately, and do heavy traffic shaping to detect torrents (encrypted or not).
JL
@JL: In theory, direct download is faster, but not in practice. With one seeder and one leecher, there shouldn't be any difference. As soon as you add another leecher, that leecher can start taking pieces from whoever has a faster connection (even if the client with the faster connection doesn't have the complete file). With a direct download, to take advantage of the faster connection, you would first have to wait for the client to finish the download before you could start.
Peter Di Cecco
I think the better question becomes how much bandwidth you save by hosting a torrent and seeding it, compared with what would have been a direct download box. Only companies like Blizzard know that now, and I haven't seen them talk numbers. Without a 'super seed', torrents rely on users to seed, which just doesn't work with asymmetric connections and people not wanting to leave their computer on with the upstream saturated.
semi
@JL: torrents are slower than direct download? My "practical" experience says different; try going to download Eclipse both ways.
Dean J
+2  A: 

Games With a Purpose - collective intelligence tools like the ones Luis von Ahn and his team are developing might have been a dream before 1980, but there wasn't a widely deployed network with millions of people available, and a need (e.g. reCAPTCHA), to actually make it happen.

Jeff Moser
+3  A: 

IP Multicast (1991) and Van Jacobson's Dissemination Networking (2006) are the biggest inventions since 1989.

James Cape
A: 

Software Patents

joeforker
Lol it will have an impact on the development for sure.. he didn't say that it has a positive impact so 1+ for that :D
Nils
+2  A: 

The first true multimedia personal computer, the Amiga: the first 32-bit preemptive multitasking personal computer, the first with hardware graphics acceleration, the first with multichannel sound and in many ways a far more useful and capable machine than the multicore, multigigahertz Windows boxen that proliferate today.

+2  A: 

The Bizarre style of development (as described in http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/ by Eric S. Raymond). Raymond credits Linus Torvalds' release of the Linux kernel in 1991 as the first use of the Bizarre style of development.

fd
Is this supposed to be funny? It doesn't quite work.
sep332
Do you mean **Bazaar**?
Barry Brown
You are correct, but I actually quite enjoyed the typo, so I left it in.
fd
+2  A: 

“Americans have no past and no future, they live in an extended present.” This describes the state of computing. We live in the 80’s extended into the 21st century. The only thing that’s changed is the size. -- Alan Kay

Source: Alan Kay: Is Computer Science an Oxymoron?

Bahaa Zaid
+1  A: 

The successful integration of different programming paradigms into single programming environments.

The exemplar of this (for me) is the Mozart/Oz programming system, which integrates functional, OO, logic, concurrent and distributed programming mechanisms into a coherent whole. There are other examples though.

+2  A: 

The memristor.

While the idea is not newer than 1980, I believe a working model was not created until 2008. Should it make it past R&D, it will be the most significant advance in computer hardware since the transistor; at the very least, obviating secondary memory.

Ellery Newcomer
A: 

In order to start thinking about this, I need a model for what "innovation" means.

The best model I've seen is The Technology Adoption Life Cycle. You can get an overview at this Wikipedia Article.

Using this model, I began to ask myself... at what stage of the life cycle is software itself? We can think of "software" as a distinct technology from machinery going all the way back to Babbage, or perhaps more precisely, to Lady Ada Lovelace.

But it surely remained at the very early pioneering stage at least until about 1951. That's the year programmed computers "went commercial" in terms of selling a model for a computer product, and building lots of units of that model. I'm thinking of the machine that Univac sold to the Census Bureau.

From 1951 to about 1985, software innovations were numerous. They mostly had to do with extending the span of computing to an ever wider field of endeavor. In parallel, mass marketing and mass production kept bringing the cost of entry down till the Apple and IBM-PC made a programmable device a commonplace appliance.

Somewhere between 1980 and 1985, I'd say that software passed from the Innovator's domain to the "Early majority" domain. Sorry, guys, but that makes all of you that participated in MS-DOS, the Mac, Windows, C++ and Java early majority rather than innovators. That doesn't preclude your having done significant innovation on your own turf and in your own projects. It just means that the field itself had moved on from the earliest stage.

While the Internet's precursor had been around since the 1970s, it wasn't until Al Gore invented the internet (sorry) that everybody hooked up. At that stage, software passed from the early majority to the late majority. This shift was subtle, as the top of the bell curve suggests. Not every shop moved from early majority to late majority at the same time.

I don't think software has quite passed into the "laggard" stage yet, but I think that real innovators are tackling the problem of producing progress on different fronts today.

Two fronts that I can think of are Bioengineering and Information Appliances. Both of these fields require software, but the main thrust is not software innovation. It's applying software to uncharted territory. There are probably lots of other fronts that I'm not even aware of.

Walter Mitty
+5  A: 

BitTorrent.

hasen j
+22  A: 

DNS, 1983, and dependent advances like email host resolution via MX records instead of bang-paths. *shudder*

Zeroconf working on top of DNS, 2000. I plug my printer into the network and my laptop sees it. I start a web server on the network and my browser sees it. (Assuming they broadcast their availability.)

NTP (1985) based on Marzullo's algorithm (1984). Accurate time over jittery networks.

The mouse scroll wheel, 1995. Using mice without it feels so primitive. And no, it's not something that Engelbart's team thought of and forgot to mention. At least not when I asked someone who was on the team at the time. (It was at some Engelbart event in 1998 or so. I got to handle one of the first mice.)

Unicode, 1987, and its dependent advances for different types of encoding, normalization, bidirectional text, etc.

Yes, it's pretty common for people to use all 5 of these every day.

Are these "really new ideas?" After all, there were mice, there were character encodings, there was network timekeeping. Tell me how I can distinguish between "new" and "really new" and I'll answer that one for you. My intuition says that these are new enough.

In smaller domains there are easily more recent advances. In bioinformatics, for example, Smith-Waterman (1981) and more especially BLAST (1990) effectively make the field possible. But it sounds like you're asking for ideas which are very broad across the entire field of computing, and the low-hanging fruit gets picked first. Thus is it always with a new field.
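For flavor, here is a compact, illustrative Python sketch of Smith-Waterman local-alignment scoring (scoring values are arbitrary and the traceback is omitted):

    def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-1):
        rows, cols = len(a) + 1, len(b) + 1
        h = [[0] * cols for _ in range(rows)]      # score matrix, zero-initialised
        best = 0
        for i in range(1, rows):
            for j in range(1, cols):
                diag = h[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                h[i][j] = max(0, diag, h[i-1][j] + gap, h[i][j-1] + gap)
                best = max(best, h[i][j])
        return best                                # best local alignment score

    print(smith_waterman_score("GATTACA", "GCATGCU"))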

Andrew Dalke
+28  A: 

I think we are looking at this the wrong way and drawing the wrong conclusions. If I get this right, the cycle goes:

Idea -> first implementation -> minority adoption -> critical mass -> commodity product

From the very first idea to the commodity, you often have centuries, assuming the idea ever makes it to that stage. Da Vinci may have drawn some kind of helicopter in 1493 but it took about 400 years to get an actual machine capable of lifting itself off the ground.

From William Bourne's first description of a submarine in 1580 to the first implementation in 1800, you have 220 years, and current submarines are still at an infancy stage: we know almost nothing of underwater traveling (with 2/3rds of the planet under sea, think of the potential real estate ;).

And there is no telling that there wasn't earlier, much earlier ideas that we just never heard of. Based on some legends, it looks like Alexander the Great used some kind of diving bell in 332 BC (which is the basic idea of a submarine: a device to carry people and air supply below the sea). Counting that, we are looking at 2000 years from idea (even with a basic prototype) to product.

What I am saying is that looking today for implementations, let alone products, that were not even ideas prior to 1980 is ... I betcha the "quick sort" algorithm was used by some no name file clerk in ancient China. So what?

There were networked computers 40 years ago, sure, but that didn't compare with today's Internet. The basic idea/technology was there, but regardless you couldn't play a game of Warcraft online.

I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"

Historically, we have never been able to "find them" that close from the idea, that fast. I think the cycle is getting faster, but computing is still darn young.

Currently, I am trying to figure out how to make a hologram (the Star Wars kind, without any physical support). I think I know how to make it work. I haven't even gathered the tools, materials, or funding, and yet even if I were to succeed to any degree, the actual idea would already be several decades old, at the very least, and related implementations/technologies have been used for just as long.

As soon as you start listing actual products, you can be pretty sure that concepts and first implementations existed a while ago. Doesn't matter.

You could argue with some reason that nothing is new, ever, or that everything is new, always. That's philosophy and both viewpoints can be defended.

From a practical viewpoint, truth lies somewhere in between. Truth is not a binary concept, boolean logic be damned.

The Chinese may have come up with the printing press a while back, but it's only been about 10 years that most people can print decent color photos at home for a reasonable price.

Invention is nowhere and everywhere, depending on your criteria and frame of reference.

Sylverdrag
+1. Take a look for instance at the iPad ;) See http://stackoverflow.com/questions/432922/significant-new-inventions-in-computing-since-1980/2618681#2618681
VonC
If only there was a fav. answer tag...if only there was an option to give 2 upvotes...
Tshepang
Great answer. Maybe we should be asking then, what *new ideas* have there been in the past 30 years (not new products/inventions). And since it's too hard to say whether or not they'll be "significant" or revolutionary before they're even built.... maybe we can speculate and then decide where to spend more energy.
Mark
There have been countless amazing new ideas in the last 30 years, but there hasn't necessarily been time to see which ones matter. Pick any field of computing and just flick through the research released in the last year, and you'll find no shortage of new ideas, from small improvements to complete overhauls. However, the 1980s and before seem so revolutionary and packed because those ideas have now come to fruition and are ubiquitous, so they seem significant. We'll be having this same discussion in 30 years, when the ideas from now have boiled down into wonderful inventions.
Perrako
A: 

Top ten software engineering ideas / picture

Comptrol
Most of the books are BS. Where is OOP and the patterns book.
Nils
+9  A: 

Reorganization is what we need, not reinvention.

We have all the hardware and software components we need right now to do amazing things for years to come.

I believe there is a disease in the Sciences, where every participant is always trying to invent something new to distinguish themselves from others. This is in contrast to doing some of the messy work of cataloging or teaching older works.

People who build 'new' things are generally considered of a higher pedigree than people who reuse existing, sometimes almost ancient, works. (Ancient, that is, to a 20-year-old, to whom something like Lisp (1958) was created more than double their lifetime in the past.)

Good old ideas need to be resurrected and propagated far and wide, and we need to stop trying to build businesses or programmer movements that effectively trample old works and systems in power-plays to be the next new thing - when in fact most 'new shiny' things are just aspects of old ideas resurrected.

So the iPad is rather a reorganization than a reinvention.
Nils
Yes, it is. The O/S is taken from the iPhone and the concepts of tablet computers and systems designed for consumption (think: set top boxes and java applets) are not new either.
ConcernedOfTunbridgeWells
@Nils: the iPad is an Apple Newton device, 25 years later. http://en.wikipedia.org/wiki/Newton_%28platform%29
Dean J
+2  A: 

This is a negative result, which is odd as a 'fundamental innovation', but I think it applies since it opened new areas of research and closed off useless ones.

The impossibility of distributed consensus: PODC Influential Paper Award: 2001

We assumed that the main value of our impossibility result was to close off unproductive lines of research on trying to find fault-tolerant consensus algorithms. But much to our surprise, it opened up entirely new lines of research. There has been analysis of exactly what assumptions about the distributed system model are needed for the impossibility proof. Many related distributed problems to which the proof also applies have been found, together with seemingly similar problems which do have solutions. Eventually a long line of research developed in which primitives were classified based on their ability to implement wait-free fault-tolerant consensus.

Steve Steiner
A: 

The teevee tube box

Janie
A: 

I do not know if somebody has already answered with "machine learning", but it is a significant new development that is developing fast: intelligent spam filtering, stock market predictions, intelligent machines like robots, ...

Maybe machine intelligence will be the next big thing.

Alphaneo
Charles Stewart
+1  A: 

Sensor networks: very tiny (nano scale) computers form ad-hoc p2p networks and transmit "sensory" information.

3D printing: a Star Trek replicator for physical objects (no Earl Grey tea yet).

DNA computing: Massively parallel computing for some types of problems.

projectshave
A: 

I would vote, as a Debian user, for package management. It makes OS X and Windows 7 look like primitive amateurish playthings.

But since package management was already mentioned, I will vote for X. The network transparent window server has made a lot of applications possible. It's wonderful to be able to seamlessly summon programs running on different computers side by side on the same screen.

And that was a tad more impressive in the late 80s.

rbanffy
A: 

Let's see, Connection Machines (Massive Parallelism) for one.

Anyway, this whole question seems like an egoboo for Alan Kay since he invented everything.

MkV
A: 

The mathematics for quantum computing has been around since before 1980, but the hardware isn't here yet and may be physically and economically infeasible for many years to come.

Joe Chung
+1  A: 

Translation software with community support to make manual corrections and recommendations, followed up with an AI bot to form patterns to eventually distinguish and correctly predict ambiguity in different translations and contexts.

While it's true Google Translate might not be that beast, it is the mother, or perhaps the grandmother of a system just waiting to be developed.

If you think about it - textual language is really input to the brain: the eyes see the text and send images to the brain, which then translates this into understanding.

While it's true communication (especially human communication) is an advanced topic, the basics are input (with context) -> translation -> understanding.

Why do we still have no really good way to send emails to distant co-workers, or partners who don't speak our language? This is obviously the Phase 1.

Once this is complete, we can move onto stuff like real-time phone call translation.

Instead, month after month, our greatest intellectual assets are involved in other, more crucial projects, like space research, and meteor detection, or trying to prove the Bible wrong (yawn).

How about we dedicate more time to basic practical communication?

JL
+1  A: 

Low cost/home computing. Something that (at least here in Blighty) wasn't really heard of until the early 1980s. Without home computing, how many people posting here would have got into computing as a career? Or even as a hobby1?

Myself, had my folks not got Clive Sinclair's humble rubber-keyed ZX Spectrum back in 1982/1983, I probably wouldn't be here now. And it wasn't just the Speccy: the C64, Vic-20, Acorn Electron, BBC A/B/Master, Oric-1, Dragon-32, etc. all fuelled the home computer market and made programmers out of every 8 year old boy and girl who had access to one.

If that wasn't a revolution in terms of computing and programming, I don't know what was...!

1 curious aside: what is the breakdown of hobbyists vs pro programmers on this site? I realise these stats aren't collated, but could be interesting to know.

Chris J
*Low cost/home computing* - revolutionary, yes, but it was essentially an economic sea change in computing, not an invention. Were there any particular significant inventions that made it possible?
Charles Stewart
+3  A: 

Well, the World Wide Web has already been mentioned, but more basically I would say "DNS". It seems that it was invented in 1983 (http://en.wikipedia.org/wiki/Domain%5FName%5FSystem) and IMHO we can consider it the mandatory link between the invention of the internet protocol and the capability to spread what is now called the web all over the world.

Still in the "network" section, I would add Wi-Fi. It was invented in the 90's (but I agree it's not exactly "computing", it's more related to hardware).

In a more strictly "algorithmic" section, I think about turbo codes (dated 1993); some say they only close in on the limit defined by Shannon's theory, but wouldn't that argument reject all other answers as "everything was already in seed in Lovelace's, Babbage's and Turing's writings"?

In the field of cryptography, I would add the PGP program by P. Zimmermann (dated 1991), which brought a quite robust (at the time) free encryption program to citizens, and contributed to shaking the government's posture on encryption a little. In fact I think it was one of the factors in the "liberalization" of cryptography, which was a prerequisite for developing e-commerce.

zim2001
A: 

The Personal Computer.

Hands down, the most important part of computing in the last thirty years is that everyone is now part of it. Computers for home use only date to 1977 or so, and widespread adoption took until well into the 80's. Now, kindergartens, senior centers, and every next door neighbor you'll ever have owns one.

Dean J
-1: Altair 8800, 1975. No post-1980 invention here.
Charles Stewart
+1  A: 

The rise of motion sensors in gaming, which do away with traditional game joysticks and bring the user very close to the game itself. This complements our ever-changing urban landscape and lifestyle, where we have limited physical activity. This advancement in gaming definitely induces at least some physical activity while doing something that one enjoys. It is definitely better than doing the same mundane reps at your gym.

rocknroll
+1  A: 

Augmented Reality. This hasn't really taken off yet, but as ideas go I think it is huge, from being able to paint virtual arrows on the ground to help you find your destination, to decorating everything around you with useful information or aesthetic fancies.

Imagine your phone ringing across the room; you look at it, and an information bubble pops up above it to tell you who is calling. How cool would that be? AR will bring massive changes in the way we think about and interact with technology.

Haunted houses would probably get significantly scarier too.

I also wanted to mention Electroencephalography for brain-computer interfacing, but apparently this was first invented in the 1970's.

Sam
+1  A: 

USB Keys/Thumb drives

USB keys were the effective replacement for the floppy, and the floppy was still superior to the CD or DVD for simple transfers.

burnt_hand
+1  A: 

The Internet.

That's it.

M28
Charles Stewart
+6  A: 

The iPad (released April 2010): surely such a concept is absolutely revolutionary!

[image: the Apple iPad]

No way Alan Kay saw that coming from the 1970's!
Imagine such a "personal, portable information manipulator"...


...

Wait? What!? The Dynabook you say?

[image: Alan Kay's Dynabook concept]

Thought out by Alan Kay as early as 1968, and described in great details in this 1972 paper??

NOOOoooooooo....

Oh well... never mind.

VonC
See http://stackoverflow.com/questions/432922/significant-new-inventions-in-computing-since-1980/642098#642098 for a larger context illustrated by this answer.
VonC
Well, surely the idea was around before (for example, the Apple Newton); however, the technology has now progressed so far that it's possible to build a cheap (and great) consumer device.
Nils
A: 

I think most concepts in computing have mostly been undergoing refinement, but there have been some new developments, particularly in distributed computing.

  1. Robustness against failure and defection, and failure recovery, e.g. Paxos, Byzantine fault tolerance, etc.
  2. I know people have mentioned P2P, and that P2P communication was happening in the 70s, but with all due respect I don't think it was of the same nature as is commonplace today, with distributed hash tables, efficient dynamic ad-hoc networks, and most importantly, anonymity (ala Freenet, Tor).

The majority of work has been refinement, and while many modern systems are little better than the original concepts first described in the 60s or earlier, some are orders of magnitude better.
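
To make the distributed hash table point concrete, here is a minimal Python sketch of Chord-style consistent hashing. The node names, the 32-bit ring size and the use of SHA-1 are illustrative assumptions; real DHTs add finger tables for O(log n) lookup, replication and churn handling.

    import bisect
    import hashlib

    def ring_id(key, bits=32):
        # Map any string onto a point on the identifier ring.
        digest = hashlib.sha1(key.encode()).digest()
        return int.from_bytes(digest, "big") % (2 ** bits)

    class TinyDHT:
        # Chord-style assignment: each key is owned by the first node whose
        # ring position is >= the key's position (wrapping around the ring).
        def __init__(self, nodes):
            self.ring = sorted((ring_id(n), n) for n in nodes)

        def owner(self, key):
            k = ring_id(key)
            idx = bisect.bisect_left(self.ring, (k, ""))
            return self.ring[idx % len(self.ring)][1]

    dht = TinyDHT(["node-a", "node-b", "node-c", "node-d"])
    for key in ["alice.txt", "bob.mp3", "carol.iso"]:
        print(key, "->", dht.owner(key))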

naasking
A: 

I'd have to say that the biggest invention in computing since 1980 is Moore's law. There were tons of really cool, innovative things created in the 1960s and 1970s - but they were insanely expensive one-off projects. And most of these projects are lost in the mists of time.

Today, the cool, innovative project gets a couple rounds of funding and is available on everybody's desktop or web browser in 6 months or so.

If that's not innovative, what is?

Eric Brown
Moore's law was actually coined in 1965.
Charles Stewart
True, but it didn't really kick in until after 1980. Z80s (1976) weren't *that* much cooler than an 8080 (1974); 8086s (in 1978) were nicer, but 68000s (1979/80) and subsequent CPUs were hands-down superior. The "Killer Micros" really didn't take over until 1990 or thereabouts.
Eric Brown
A: 

I would say Linux and the reification of the worse-is-better philosophy, but you can argue that those are older. So I'd say: quantum, chemical, peptide, DNA and membrane computing; automated, non-ad-hoc (re)factoring; aspects; generic programming; some types of type inference; and some types of testing.

The reason why we have no new ideas: software patents (which date back to the late 60s ...), corporations and education.

Javier Diaz
+3  A: 

One thing that hasn't changed in mainstream computing is the hierarchical filesystem. That's a shame, IMO, since some work was being done in the late 1980s and 1990s to design new kinds of file systems more appropriate for modern, object-oriented operating systems -- ones which are OO from the ground up.

The OO operating systems tended to have flat object stores that were expandable and flexible. I think the EROS Project was one built around that idea; PenPoint OS was a 1990s object-oriented OS; and Amazon S3 of course is a contemporary flat object store.

There are at least two ideas in OO flat filesystems that I particularly liked:

  • The entire disk was essentially swap space. Objects exist in memory, get paged out when they are not needed, and brought back in when they are. There's no need for a hierarchical filesystem that's separate from virtual memory. Programs are "always running," in a sense.

  • A flat file/object store allows content to be indexed and searched, rather than forcing the user to decide -- ahead of time -- where the content will live in relation to other content and what its name shall be. A hierarchical system could be built on top of the flat storage, but it's not required. (A minimal sketch of this idea follows below.)

As Alan Cooper states in his book About Face, hierarchical filesystems are a kludge, designed for the computers of the 1960s and 1970s with limited memory and disk storage. Sadly, the popularity of Windows and Unix has guaranteed the dominance of the hierarchical filesystem to this day.
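
As a toy illustration of the flat-store idea (the class, the tags and the in-memory dict are assumptions for the sketch, not any particular OS's API): objects live under opaque IDs, and both search and any folder-like hierarchy are just indexes layered on top.

    import uuid
    from collections import defaultdict

    class FlatStore:
        # Objects live in a single flat namespace keyed by opaque IDs;
        # 'folders' are just one of many possible indexes layered on top.
        def __init__(self):
            self.objects = {}                 # id -> content
            self.index = defaultdict(set)     # tag/word -> set of ids

        def put(self, content, tags=()):
            oid = uuid.uuid4().hex
            self.objects[oid] = content
            for term in set(content.lower().split()) | set(tags):
                self.index[term].add(oid)
            return oid

        def search(self, term):
            return [self.objects[oid] for oid in self.index.get(term.lower(), ())]

    store = FlatStore()
    store.put("quarterly report for finance", tags={"2009", "reports"})
    store.put("holiday photos from norway", tags={"photos"})
    print(store.search("finance"))   # content found without any path or filename
    print(store.search("photos"))    # a tag behaves like a folder, but isn't one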

Barry Brown
+1  A: 

I think a very important invention for computing in the past 50 years was Google. The internet means nothing without a good tool to search it. The advent of the search engine revolutionized the internet and enabled it to be monetized by the little guy.
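
Arguably the genuinely new part was the link-analysis ranking behind it. Here is a rough Python sketch of PageRank-style power iteration; the tiny web graph, damping factor and iteration count are illustrative assumptions, not Google's actual implementation.

    def pagerank(links, damping=0.85, iterations=50):
        # Power-iteration PageRank over a dict {page: [outgoing links]}.
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new = {p: (1.0 - damping) / len(pages) for p in pages}
            for p, outs in links.items():
                if not outs:                      # dangling page: spread evenly
                    for q in pages:
                        new[q] += damping * rank[p] / len(pages)
                else:                             # share rank across out-links
                    for q in outs:
                        new[q] += damping * rank[p] / len(outs)
            rank = new
        return rank

    web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    for page, score in sorted(pagerank(web).items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))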

Luke101
But you do know that search has been around a lot longer? Sure Google made it better and more mainstream but they hardly invented it.
Jonas
+2  A: 

Virtualization?
Applications like VirtualBox OSE or VMware have saved me many hours.

Behrooz
CP-67 predates 1980 by a long shot.
Windows programmer
From http://www.kernelthread.com/publications/virtualization/: In the mid 1960s, the IBM Watson Research Center was home to the M44/44X Project, the goal being to evaluate the then emerging time sharing system concepts. The architecture was based on virtual machines: the main machine was an IBM 7044 (M44) and each virtual machine was an experimental image of the main machine (44X).
Charles Stewart
+5  A: 

Pretty much everything important in modern 3D computer graphics. Ray tracing (in the computer graphics sense) got its jump start from Whitted's 1980 paper. Marching cubes ('87) is the standard way to extract an isosurface from 3D data.
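
To give a flavour of what sits at the core of Whitted-style ray tracing, here is a minimal Python sketch of ray-sphere intersection rendered as ASCII. The camera, scene and resolution are illustrative; recursion for reflection and refraction, shading models and acceleration structures are all omitted.

    import math

    def ray_sphere(origin, direction, center, radius):
        # Return the distance along the ray to the nearest hit, or None.
        # Solves |o + t*d - c|^2 = r^2 for t, with d assumed normalized.
        oc = [o - c for o, c in zip(origin, center)]
        b = 2.0 * sum(d * v for d, v in zip(direction, oc))
        c = sum(v * v for v in oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0:
            return None
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t > 0 else None

    # A 24x48 'ASCII render' of one sphere, one primary ray per character.
    width, height = 48, 24
    for y in range(height):
        row = ""
        for x in range(width):
            # Map the pixel to a direction through a simple pinhole camera.
            dx = (x - width / 2) / width
            dy = (y - height / 2) / height
            length = math.sqrt(dx * dx + dy * dy + 1.0)
            d = (dx / length, dy / length, 1.0 / length)
            hit = ray_sphere((0, 0, 0), d, (0, 0, 3), 1.0)
            row += "#" if hit else "."
        print(row)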

Eric
+2  A: 

USB

Ravi
It's a serial bus standard. Serial data transmission is older than the general purpose computer. Does it have any "really new ideas"? It looks like a standardisation effort to me.
Charles Stewart
The new idea of USB focuses on end user ease of use. A tree of devices that can all communicate on the same bus is a huge improvement. This ease of use is why USB won out over all the other bus standards, in my opinion.
Shane Holloway
USB also allows more than 1 peripheral to connect to the computer's USB port. Lots more than 1, in fact.
Windows programmer
A: 

RAID (1988).

Arguably this is just an application of error correction codes from years gone by, but then arguably everything in computer science can be reduced to basic mathematics which has been around for millennia.
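
As a toy illustration of the RAID-4/5 parity idea (the block contents are made up; striping, parity rotation and real block devices are omitted): the parity block is the XOR of the data blocks, so any single lost block can be rebuilt from the survivors.

    from functools import reduce

    def xor_blocks(blocks):
        # Byte-wise XOR of equally sized blocks.
        return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

    # Three data blocks on three 'disks', plus one parity block on a fourth.
    data = [b"NCC-1701", b"deadbeef", b"RAID1988"]
    parity = xor_blocks(data)

    # Simulate losing disk 1 and rebuilding it from the survivors plus parity.
    survivors = [data[0], data[2], parity]
    rebuilt = xor_blocks(survivors)
    assert rebuilt == data[1]
    print("recovered:", rebuilt)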

Mark Thomas
A: 

Personal Broadcast Communication

Facebook, Twitter, Buzz, Qaiku... the implementations vary, each focusing on different aspects - managed audiences, conciseness, discussions. The specific services come and go, but the new concept of communication remains. Blogs are of course what started this, but the new services have made the communication socially connected, which is an essential difference.

Not quite sure whether this falls exactly under the subject of computing, but it's something significant, and it was only made possible by computing and networks.

Ilari Kajaste
Any explanation for the downvote?
Ilari Kajaste
A: 

Open Croquet http://www.opencroquet.org - A Squeak/Smalltalk-based 3D environment which lets multiple users interact with and program the environment from inside itself. It has its own object replication protocol for sharing environments efficiently and scalably over the internet. It's difficult to describe because there just isn't anything else remotely like it...

1) I'm proposing this because when I try to explain to other people what it is, I find them expecting me to compare it to other things... and I still haven't found anything remotely like it. Although there are many elements present from other systems (e.g. Smalltalk, OpenGL, Etoys, virtual worlds, remote collaboration, object-oriented replication architectures), the whole seems to be much more than the parts...

2) Unlike many of the technologies mentioned here it hasn't settled down into a widely exploited commercial niche...

Both points are signs of an early-stage technology.

I suspect that when Alan Kay started work on it, he might have been thinking about the theme of this question in the first place.

http://www.onlisareinsradar.com/archives/001281.php

cartoonfox
+1  A: 

Augmented Reality

Where a view of the real world is combined with virtual elements in some way.

The term Virtual Reality was coined in 1989, a few years before the term "Augmented Reality" came into existence.

Some early enabling technologies were invented before 1980, but the concept itself dates from the early nineties (at least, that's what Wikipedia says).

http://en.wikipedia.org/wiki/Augmented_reality#History

cartoonfox
Dup: http://stackoverflow.com/questions/432922/significant-new-inventions-in-computing-since-1980/2371766#2371766
Charles Stewart
A: 

I would say that CDMA was/is an important and powerful new idea that was created after 1980.
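
A toy baseband illustration in Python of the code-division idea: each sender spreads its bits with an orthogonal chip sequence, the signals add up on the shared channel, and each receiver correlates with its own code to recover its bits. The Walsh codes and the two-sender setup are illustrative; real CDMA uses pseudo-noise codes, power control and synchronization.

    # Orthogonal Walsh codes of length 4 (rows of a 4x4 Hadamard matrix).
    WALSH = [
        [ 1,  1,  1,  1],
        [ 1, -1,  1, -1],
        [ 1,  1, -1, -1],
        [ 1, -1, -1,  1],
    ]

    def spread(bits, code):
        # Map each bit to +/-1 and multiply it by the sender's chip sequence.
        return [(1 if b else -1) * chip for b in bits for chip in code]

    def despread(signal, code):
        # Correlate the summed channel with one code to recover that sender's bits.
        n = len(code)
        bits = []
        for i in range(0, len(signal), n):
            corr = sum(s * c for s, c in zip(signal[i:i + n], code))
            bits.append(1 if corr > 0 else 0)
        return bits

    alice, bob = [1, 0, 1], [0, 0, 1]
    channel = [a + b for a, b in zip(spread(alice, WALSH[1]), spread(bob, WALSH[2]))]
    print(despread(channel, WALSH[1]))   # -> [1, 0, 1]
    print(despread(channel, WALSH[2]))   # -> [0, 0, 1]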

Charles Stewart
A: 

The C++ programming language (1983) and template metaprogramming (1994).

plan9assembler
What about C++ is meant to be a significant new invention? C++'s templates (like the C++ STL) are derived from Ada's generics (1977), which in turn were based on the meta-programming facility in Liskov's CLU.
Charles Stewart
+2  A: 

Maybe a forum of science fiction authors would give you more interesting answers? ;-)

I suspect there's a bit of a fallacy at work here: you're viewing the history of technology and science as a steady march of progress, as a linear phenomenon. I suspect it is in fact a process of fits and starts, context, economics, serendipity and plain old randomness.

You should feel fortunate that you were at the centre of one of the great waves of history; most people will never have that experience.

Paul Johnson