views:

813

answers:

31

What were the most significant events or milestones in the history of computer science?

I haven't been able to find a potted history, so I thought I'd see what views the SO community had on the question. I'm studying for a Masters in CS at the moment, so I'm hoping for some stuff to go take a look at that I've not come across before.

Related:
Computer science advances in past 5 years
Significant new inventions in computing since 1980

+22  A: 

Invention of Turing Machine

Roman
+ Turing's proof of the undecidability of the halting problem ...
mjy
I have to agree with this.
shake
I had a professor who claimed to have known Alan Turing. Apparently, he had a VW bus at one point, which he called his Turing Machine. Pretty punny for a genius.
mcliedtk
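The model this answer names is simple enough to run. Below is a minimal Turing machine simulator (the rule format and the example program are my own, purely for illustration); the sample machine walks right over its input, flipping each bit, and halts at the first blank:

```python
# Minimal Turing machine simulator: a dict-based transition table,
# a sparse tape that grows on demand, and a step loop until HALT.

def run_tm(rules, tape, state="start", blank="_", max_steps=10_000):
    """rules: {(state, symbol): (new_state, write_symbol, move)}, move in {-1, 0, +1}."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    for _ in range(max_steps):
        if state == "HALT":
            break
        symbol = tape.get(pos, blank)
        state, write, move = rules[(state, symbol)]
        tape[pos] = write
        pos += move
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Example machine: walk right, flipping 0 <-> 1, halt at the first blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("HALT", "_", 0),
}

print(run_tm(flip, "10110"))  # -> 01001
```

The whole machine is just a lookup table plus a read/write head, which is exactly why Turing's model is so useful for proofs like the undecidability result mentioned above.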
+5  A: 

Invention of the C programming language

JaredPar
I'd say LISP was actually more valuable. C might be great from an engineering perspective but for actual science ... not that much I think.
Joey
while undoubtedly used by many computer scientists, it's an engineering feat, C wasn't even particularly innovative as a language...
mjy
+20  A: 

Invention of the zero (0), ca 2000 BC

Henk Holterman
+4  A: 

First Bug

vkraemer
+3  A: 

Invention of the Williams Tube - the first RAM

Brabster
A series of tubes?
Kevin Panko
+3  A: 

Binary Number System - Though this predates Computers.

Andriyev
Binary digits are just a concession to a weakness of electronics.
Henk Holterman
In electronics, lower bases are often easier to implement and less error-prone. The binary number system is also more efficient in places other than electronics. For example: normally we can only count up to 10 on our fingers, but if we count in binary with them, we can reach 1023.
Wallacoloo
@wallacoloo You say that base 2 is more efficient than base 1? Tell me more.
jleedev
@jleedev Did you already see my example of counting in base 2 vs 1? Now if you're talking electronics, how would _you_ implement base 1?
Wallacoloo
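The finger-counting claim in the comments above is easy to check: ten fingers in unary reach 10, while treating each finger as a bit reaches 2^10 - 1 = 1023. A quick sketch:

```python
# Each finger is one binary digit: 10 fingers encode values 0 .. 2**10 - 1.
fingers = 10

unary_max = fingers            # one raised finger per unit counted
binary_max = 2 ** fingers - 1  # each finger is a bit (raised = 1)

print(unary_max)   # -> 10
print(binary_max)  # -> 1023

# The maximum is all ten fingers raised, i.e. ten 1-bits:
assert binary_max == int("1" * fingers, 2)
```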
+4  A: 

Internet and first high-quality search engine

Roman
+1 for the internet, .1 for search
DaveDev
+11  A: 

The stored program. Before that, you didn't program so much as re-wire.

JustJeff
Von Neumann, http://en.wikipedia.org/wiki/Von_Neumann_architecture
Henk Holterman
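The shift this answer describes can be sketched as a fetch-decode-execute loop over a program held in the same memory as its data. The tiny instruction set below is invented for illustration, not any real machine:

```python
# A toy von Neumann machine: program and data share one memory, and the
# "CPU" is just a loop that fetches the next instruction and dispatches it.

def run(memory):
    acc, pc = 0, 0
    while True:
        op, arg = memory[pc]            # fetch + decode
        pc += 1
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# memory[0..3] is the program; memory[4..6] is the data it operates on.
memory = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
    3, 4, 0,
]
print(run(memory)[6])  # -> 7
```

Because the program is ordinary memory contents, changing what the machine does means writing new data, not re-wiring, which is exactly the point of the answer above.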
+7  A: 

Garbage collection

Ben Griswold
Amen to that one.
wheaties
+6  A: 

First compiler (A-0 programming language, Grace Hopper, 1952)

[Source: Wikipedia]

Simon Nickerson
A: 

Linux and other free software.

EDIT: Motivation: these tools give most CS people the means to carry out their research and development and to learn more. This has greatly increased the pace of, and interest in, the subject. That is why it is important.

Rickard von Essen
-1, and not because of any feelings towards Linux (which I'm using to write this now). You could say that pens and paper have given "most CS people the means to carry out their research and development and to learn more", and have "greatly increased the pace of, and interest in, the subject".
Roger Pate
@Roger Pate But free software has specifically impacted CS; pen and paper have carried humanity to where we are now in most (if not all) fields of knowledge.
Rickard von Essen
+9  A: 

Invention of the Lambda Calculus, and of functional languages as a result.

Roman
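The lambda calculus needs nothing but single-argument functions, which makes it easy to demonstrate in any language with closures. Here is the standard Church-numeral encoding sketched in Python (the decoder `to_int` is my own helper for illustration):

```python
# Church numerals: the number n is encoded as "apply f to x, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by applying 'add one' to 0, n times."""
    return n(lambda k: k + 1)(0)

two   = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # -> 5
```

Numbers, arithmetic, booleans, and control flow can all be built this way, which is why the calculus could serve as a foundation for the functional languages the answer mentions.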
+9  A: 

I would think the invention of solid-state electronics/semiconductors/integrated circuits had a HUGE impact.

Also, publication of volumes 1-3 of The Art of Computer Programming. Maybe not "most" important, but it is a seminal work.

Charles
+1  A: 

The qubit .. if they can be entangled in sufficient quantity, a LOT of things will need to be reconsidered.

JustJeff
+1  A: 

Assembly language,

Compilers

N 1.1
+1  A: 

World War II

cd1
+2  A: 

Invention of the algorithm (usually credited to Al-Khwārizmī in 9th century AD)

Simon Nickerson
I thought Al Gore invented them. Or was that something else?
kibibu
@kibibu HAHAHAHA! THAT JOKE IS STILL FUNNY IN 2010! HAHAHAHAHAHA /me dies
Andrew Heath
I've never heard anyone else tell that joke, although I've done so myself. I think it's a pun so obvious that everybody comes up with it on their own.
Wallacoloo
+2  A: 

Hash Function - allowing indexes in databases, etc. First referenced in 1953, according to Wikipedia.

Brabster
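The database use this answer mentions boils down to mapping a key through a hash function to a bucket, so a lookup probes one bucket instead of scanning every row. A minimal sketch (the class name, bucket count, and row layout are my own, for illustration):

```python
# A toy hash index: rows are stored in buckets chosen by hash(key) % size,
# so finding a row costs one bucket scan rather than a full-table scan.

class HashIndex:
    def __init__(self, size=8):
        self.buckets = [[] for _ in range(size)]

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def insert(self, key, row):
        self._bucket(key).append((key, row))

    def find(self, key):
        for k, row in self._bucket(key):
            if k == key:
                return row
        return None

idx = HashIndex()
idx.insert("alice", {"age": 30})
idx.insert("bob", {"age": 25})
print(idx.find("bob"))  # -> {'age': 25}
```

Real database hash indexes add collision handling at scale and resizing, but the core idea is exactly this bucket lookup.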
+4  A: 

A significant, although not positive, event was the first time a computation algorithm was patented.

James McLeod
+2  A: 

The idea of sticking to just two element states.

This allowed electronic components to be used for reliable calculation. Before that, attempts to work with more element states (there was an early attempt with three states) were not entirely successful: different voltage ranges for different types of elements, jitter, and other effects disrupted reliable state recognition.

So someone came up with the idea of reducing the number of states to the minimum that was still useful. That was two.

This is how the binary system was born.

Developer Art
"...and it had notable advantages over the binary computers which eventually replaced it (such as lower electricity consumption and lower production cost)." http://en.wikipedia.org/wiki/Ternary_computer
Roger Pate
Duplicate of http://stackoverflow.com/questions/2458751/significant-events-in-computer-science/2458798#2458798
Roger Pate
A: 

I'd say wikis & Web 2.0 have their significance, particularly in the context of a site like StackOverflow. This is a knowledge centre that anyone can contribute to, and which is constantly vetted to a great degree by its participants. Our species has never had anything like this before for collaboration & information sharing.

DaveDev
But how is it a significant event *in Computer Science?*
Roger Pate
+1  A: 

Invention of Calculator.

Jojo Sardez
A: 

Y2K, wait, never mind.

mikerobi
No, this was still significant. It proved that software must be designed with consideration for the future. Sure, computers didn't randomly blow up, but there were still quite a few changes made.
Wallacoloo
A: 

Pascal's calculator. 1645!

Mathias
+2  A: 

Indirect addressing

uncle brad
If you ever had to write self modifying code you would vote me up.
uncle brad
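The comment about self-modifying code is the point of this answer: without indirect addressing, the only way to vary which memory cell an instruction touched was to rewrite the instruction itself. With it, the address is just data. A sketch using a plain list as memory:

```python
# memory[0] holds an *address*; the load goes through it indirectly,
# so re-aiming the read is a data write, not a change to any instruction.
memory = [3, 10, 20, 30, 40]

def load_indirect(mem, pointer_cell):
    # Two memory accesses: fetch the pointer, then fetch what it points at.
    return mem[mem[pointer_cell]]

print(load_indirect(memory, 0))  # -> 30  (reads memory[3])
memory[0] = 1                    # re-aim the pointer by writing data...
print(load_indirect(memory, 0))  # -> 10  (...no code was modified)
```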
+2  A: 

Charles Babbage and the first real computer - it's hard to have computer science history without a first computer!

wolfsnipes
It's hard to call Babbage's machine the first real computer, because it was never built. It may have been the first real computer *design*, but it was never a real computer.
Gabe
The fatal flaw with this design is that if one puts wrong figures in, the right answer does not come out.
Kevin Panko
+1  A: 

The first computer program, written sometime circa 1842-1843 by Ada Lovelace, the first programmer.

Gabe
+2  A: 

After recently reading about Charles P. Thacker, the 2009 Turing Award recipient, one idea would be to take a look at the list of Turing Award recipients and what some of their accomplishments were. This will take you back to 1966 and contains many advancements in computer science.

Bratch
+1  A: 

The Abacus http://en.wikipedia.org/wiki/Abacus

steve
+1 Here are some crayons to go with that, try not to mess things up.
uncle brad
A: 

Creation of ARPANET - the predecessor of the Internet.

John Doe