views: 849

answers: 20

Computing as a discipline in its own right (rather than as a discussion of whether it is Mathematics or Physics) is a reasonably young science. Wherever you trace its roots (e.g. Turing's paper in 1936, Babbage's engines, ATLAS, ENIAC or LEO) it's much younger than most modern nation states.

I've been programming (using the loosest definition) for close to 30 years, yet I still come across parts of its history that are new to me. It shouldn't surprise me that, having studied computing at school, at university, as a post-grad and now as a day job, there is still stuff I don't know, but it still gives me a small pause when I read an article about something that seems pivotal yet about which I know absolutely nothing.

For instance, this news item describes a conference which looks to have sown the seeds for many things that have come since. It was 1968, the world was young, IBM was old, Microsoft and Sun barely a glint in their creators' eyes; the notion of separating hardware and software was new, and the largest institutions had networks numbering in the dozens of machines. Yet some of the conference's conclusions are fresh and remain unresolved, in particular how to manage large projects. (The proceedings are fascinating and full of lessons for the future software engineer.)

The question is YASOP (*): what piece of computing history do you think still has significance for our current industry, but which you feel people don't know enough about?

(*) Yet Another Stack Overflow Poll

+29  A: 

Bletchley Park.

Back in WWII (or WW2 for my American friends ;) ), the modern computer was invented in England at Bletchley Park. After the war, the British government destroyed all evidence of its existence and swore everyone involved to secrecy. Thus the world came to assume the modern computer was invented in the USA a few years later. Even today - after the secrets of Bletchley Park have been revealed - most programmers remain oblivious to it.

(As a quick aside, it is also a national disgrace for us Brits that we haven't bothered to fund the upkeep of this historic site, and it took a recent generous donation from US companies to highlight its plight.)

David Arno
That's exactly the kind of answer I was after! I'd recommend a visit to Bletchley Park to anyone who finds themselves in the area. It's inspiring and dispiriting in equal measure: the job they did and, as David points out, the mess we've made of their legacy.
Unsliced
We say "WWII" too. :-)
Just Some Guy
Even (or should I say "especially"?) as a German, I was deeply saddened to hear how Bletchley Park was more or less abandoned. It is a national monument in my opinion.
Michael Stum
It's a bigger disgrace that, when this answer was written, the British government still hadn't given Turing a posthumous apology for the way they treated him because he was homosexual. Keeping how big a war hero he was a secret had some security justification; harassing and torturing him until he committed suicide did not.
Jon Hanna
I agree with you completely, Jon. Subsequently, though, Brown did issue an apology after the brilliantly successful online campaign last year. So there was an eventual happy ending to that great man's life.
David Arno
+6  A: 

I have a rare piece of early computer trivia: the first computer program run on the ENIAC was a program to produce the first 1000 decimal places of Pi (source: my dad, who was there at the initial run). You could argue that this was the first "official" computer program ever.

devinmoore
Ada Lovelace would probably argue that one with you.
Skizz
She would, would she? And how many programs did she actually run?
PintSizedCat
From the wiki article: "her [Ada Lovelace] work never ran ... The ENIAC programming team, consisting of Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Wescoff, Fran Bilas and Ruth Lichterman were the first working programmers."
PintSizedCat
Running the program is but a mere technicality. I'm sure if Mr Babbage had pulled his finger out she would have executed plenty.
Skizz
You don't get medals by being a slacker! Maybe if she wasn't around to distract him things would have been different.
PintSizedCat
+1  A: 

The Xerox Alto for introducing the world to GUIs.

Skizz
...and Apple for completely ripping it off. Xerox had no business sense back then.
Gary Willoughby
Also, isn't the topic *unknown* pieces of computer history? I'm pretty sure lots of people (at least lots of geeks) know that Xerox made the first GUI system and mouse...
Nik Reiman
+2  A: 

SAGE, the Semi-Automatic Ground Environment. Our original air traffic control system. Designed in the 1950s, operational from 1963 to 1983. Vacuum tubes, ferrite core memory, magnetic drums, and teletype machines. It was the first large-scale computer control system. It tracked all domestic flights in the USA. I was stationed at Griffiss AFB, which had one of the last operational SAGE systems.

Jim C
Wow. You don't mean Griffiss AFB in Rome, New York, do you? That's amazing. I didn't know I was so close to computing history, if so.
Thomas Owens
Operational from 1063 - so the French could have just taken a plane when they invaded England 3 years later? ;)
MSalters
+4  A: 

In June 1945, John von Neumann published a 10-page report titled "First Draft of a Report on the EDVAC". It contained the outline of pretty much every general-purpose computer built since. The EDVAC's two main innovations were the use of binary instead of decimal and that it was to be a stored-program computer.

Without stored-program computers, we would still be in the stone age of computing.

More here.
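To make the stored-program idea concrete, here is a minimal toy sketch in Python (the instruction set is invented for illustration, not the EDVAC's): instructions and data sit in the same memory, and the machine simply fetches whatever the program counter points at, so code can be read, copied, or even rewritten like any other data.

    # Toy stored-program machine (illustrative only, not the EDVAC's design).
    # Instructions and data share one memory array.
    memory = [
        ("LOAD", 6),   # 0: acc = memory[6]
        ("ADD", 7),    # 1: acc += memory[7]
        ("STORE", 8),  # 2: memory[8] = acc
        ("PRINT", 8),  # 3: print memory[8]
        ("HALT", 0),   # 4: stop
        0,             # 5: unused
        2,             # 6: data
        3,             # 7: data
        0,             # 8: result
    ]

    acc, pc = 0, 0
    while True:
        op, addr = memory[pc]    # fetch and decode
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc   # a program could even overwrite its own code
        elif op == "PRINT":
            print(memory[addr])  # prints 5
        elif op == "HALT":
            break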

Dan Dyer
+2  A: 

Vannevar Bush and his memex.

pjz
A: 

Calvin Mooers' invention of the TRAC programming language. It was ahead of its time in a number of ways, and unfortunately his attempts to control and profit from it probably led to it not being more widely known.

feoh
+8  A: 

Some time in 1973, Ken Thompson sat down for a single-night coding session, implemented Doug McIlroy's vague idea of pipes in the early UNIX code, and invented the "|" notation in its shell. This was the moment when the UNIX system took off. An article describes it like this:

The philosophy that everyone started to put forth was 'Write programs that do one thing and do it well. Write programs to work together. Write programs that handle text streams, because that is a universal interface.'

The development of pipes led to the concept of tools -- software programs that would be in a "tool box".

Novices to the system could experiment, linking different commands together for what they thought should be the output. And very often their "pipes" worked the first time.
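As a rough illustration of what the "|" notation does, here is a small Python sketch (the file name and search pattern are made up for the example) that wires up the equivalent of "cat access.log | grep ERROR | wc -l" by connecting each process's standard output to the next one's standard input, which is essentially the shell's job when it sees a pipe.

    import subprocess

    # Roughly what the shell does for:  cat access.log | grep ERROR | wc -l
    # (file name and pattern invented for this example)
    cat  = subprocess.Popen(["cat", "access.log"], stdout=subprocess.PIPE)
    grep = subprocess.Popen(["grep", "ERROR"], stdin=cat.stdout, stdout=subprocess.PIPE)
    wc   = subprocess.Popen(["wc", "-l"], stdin=grep.stdout, stdout=subprocess.PIPE)

    # Each program does one small job; the stream of text between them is the
    # "universal interface" the quote above talks about.
    cat.stdout.close()    # so cat gets SIGPIPE if grep exits early
    grep.stdout.close()
    print(wc.communicate()[0].decode().strip())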

mkoeller
A: 

The concept of the stored-program computer, conceived by J. Presper Eckert and John Mauchly in 1944, but for which John von Neumann has unfairly taken the credit.

Rob Kam
+2  A: 

Ole-Johan Dahl and Kristen Nygaard developed the Simula languages, the first object-oriented programming languages, in the 1960s.

CStick
+1  A: 

Quoting from Squeak wiki...

...In December 1979, the Xerox Palo Alto Research Center developed the first prototype for a GUI. A young man named Steve Jobs, looking for new ideas to work into future iterations of the Apple computer, traded US $1 million in stock options to Xerox for a detailed tour of their facilities and current projects. One of the things Xerox showed Jobs and other members of the Apple Lisa team was the Alto machine, which sported a GUI and a three-button mouse. When Jobs saw this prototype, he had an epiphany and set out to bring the GUI to the public. Apple Computer then commercialized and refined the GUI into a system very much like that we use today—a system which became nearly ubiquitous after its adoption in Microsoft Windows. The first popular personal computer, the Apple 2, was a hit - and made Steve Jobs one of the biggest names of a brand-new industry...

utku_karatas
Unfortunately, to this day, neither Jobs nor Gates understand what the Alto GUI was about, and thus both OSX and Windows still are lightyears behind a typical 1980s Smalltalk system.
Jörg W Mittag
+5  A: 

The public debut of the computer mouse on December 9, 1968 when Douglas C. Engelbart and the group of 17 researchers working with him in the Augmentation Research Center at Stanford Research Institute in Menlo Park, CA, presented a 90-minute live public demonstration of the Online System, NLS (which was the inspiration for the Xerox Alto).

Rob Kam
This demo also introduced graphical user interfaces, hypertext, videoconferencing, and email. Steven Levy called it "The mother of all demos".
Dour High Arch
+7  A: 

The invention of an algebraic system of logic by George Boole in 1847, which forms the basis of all modern digital computers. At the time it appeared to have no practical uses.

Approximately seventy years after Boole's death, Claude Shannon attended a philosophy class at the University of Michigan which introduced him to Boole's studies. Then in 1937 Shannon wrote a master's thesis at MIT, in which he showed how Boolean algebra could optimize the design of systems of electromechanical relays (telephone exchanges), and that circuits with relays could solve Boolean algebra problems.

In 1935, earlier than Shannon, Victor Shestakov at Moscow State University proposed a theory of electric switches based on Boolean logic, but the first publication of Shestakov's result did not take place until 1941.
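A toy example in the spirit of Shannon's observation (not taken from his thesis): because switching circuits obey Boole's algebra, two circuits that compute the same Boolean function are interchangeable, so the algebra can be used to shrink the hardware. Here "(A and B) or (A and not B)" reduces to plain "A", i.e. one switch instead of three.

    from itertools import product

    # Exhaustively check that the three-relay circuit and the single switch
    # compute the same function of the inputs A and B.
    original   = lambda a, b: (a and b) or (a and not b)
    simplified = lambda a, b: a

    for a, b in product([False, True], repeat=2):
        assert original(a, b) == simplified(a, b)
    print("Both circuits agree on every input.")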

Rob Kam
+5  A: 

>> Konrad Zuse <<

+1  A: 

JCR Licklider and his overall vision of man-machine symbiosis. He put his money (ARPA money) where his mouth was by funding all the major research in the '50s and '60s that gave us the technologies that make up the current PC. Another one would be Claude Shannon's information theory. Not only did this influence the direction of computing, it was also later influential in the development of cryptography outside of the NSA and in the public domain.

HeretoLearn
+2  A: 

Louis Pouzin invented both one of the first packet-switching networks and the concept of the shell and its commands.

js
A: 

I'd have to go with the development of numerical weather prediction, based on the principles which were first derived by Lewis Fry Richardson. He proposed that the primitive equations could be solved using a finite-differencing scheme, but when he attempted it by hand the result was off by an order of magnitude (it turns out that he didn't account for sound waves, which arise as a result of the compressibility of the atmosphere and propagate much faster than gravity/baroclinic waves).

It wasn't until the advent of ENIAC that Charney and numerous other American scientists were able to complete the first successful numerical forecast by solving the barotropic vorticity equation (a simplified, boiled-down version of the equations of atmospheric motion). From their early work, the entire field of numerical weather prediction has been refined over the past 60 years in an attempt to accurately reproduce and forecast the state of the atmosphere. You can actually download a port of their original code to MATLAB and solve, in about 15 seconds, the same set of equations that took them about 12 hours.

Advances in computing power, the advent of massively parallel computing, and specialized programming languages (mostly FORTRAN) have allowed us to attack problems such as complex boundary-layer flow and the development of tornadoes, and to gain an understanding of natural climate variability. And of course, they make the TV met's job a lot easier too ;)
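For anyone unfamiliar with finite differencing, here is a deliberately tiny stand-in in Python (1-D linear advection with an upwind scheme, nothing like the full barotropic vorticity equation): the spatial derivative becomes a difference between neighbouring grid points, and the field is stepped forward in small time increments, which is the same basic recipe Richardson proposed.

    import math

    # Advect a smooth bump with du/dt + c * du/dx = 0 on a periodic 1-D grid,
    # using a first-order upwind finite-difference scheme.
    nx, c, dx, dt, steps = 100, 1.0, 1.0, 0.5, 100   # c*dt/dx = 0.5 keeps it stable
    u = [math.exp(-((i - 20) ** 2) / 50.0) for i in range(nx)]  # initial bump at x = 20

    for _ in range(steps):
        # u[i-1] wraps around at i = 0, giving a periodic boundary
        u = [u[i] - c * dt / dx * (u[i] - u[i - 1]) for i in range(nx)]

    peak = max(range(nx), key=lambda i: u[i])
    print("bump has moved to grid point", peak)   # roughly 20 + c*steps*dt = 70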

Michael Morris
+2  A: 

Google "the secret history of Silicon Valley". They recently hosted a tech talk about how the CIA and NSA developed the major players early on as a form of tech transfer. It turns out that the war played a greater role in the development of the Valley than we knew.

William Resing
I love this answer :)
Unsliced
+1  A: 

Not so much a moment, but the PLATO system seems relatively little talked about, if not unknown. It's credited as the first computer-assisted learning system. One of the designer's goals was to bring university-level education and access to technology to everyone, though costs prevented this from happening. Later versions featured monochrome plasma displays.

In 1972, researchers from Xerox took a tour and adapted some of what they saw for their work in Palo Alto. PLATO didn't have as big an impact on the work at PARC as PARC had on a certain other company, but the story is still oddly familiar.

On the darker side, PLATO's messaging system was the direct inspiration for Lotus Notes.

outis
A: 

In 1870, Émile Baudot designed the first binary character encoding, enabling translation between human-readable text and binary numbers.

Jon Hanna