views: 883
answers: 16

Note that this is not about the .NET CLR that Microsoft is thrusting into the atmosphere to evangelize the concept of managed code. Most of you know that managed code has been around for quite some time and isn't exactly rocket science.

What I would like to know is why the concept of runtime security came so late in the evolution of computers.

I know this is like asking "why didn't the first Model T Ford come with airbags and seat belts?". The question still stands despite this, because it's well within human nature to protect against known dangers. E.g. the first Model T didn't go fast enough to motivate airbag research, and it didn't go fast enough for people to make fatal errors of judgement often enough to motivate seat belts becoming law and standard in many countries.

In computer evolution it's almost the other way around. We started out with assembler, the equivalent of driving a Model T at 200 mph with an eye patch. I've had the pleasure of conversing with a few old truckers from this era, hearing stories about hand-assembling code, human debuggers, gazillions of lines of code, etc. If we make a really nasty error in C, we might end up with a blue screen. Decades ago, you could end up with damaged hardware and god knows what. But it's a mystery to me - so many decades, and all we did to make crashing less painful was the blue screen (sorry for using MS as the archetype for anything).

It's not only within human nature to protect against known dangers, it's also within any programmer's nature to automate and systemize common facilities: error checking, memory diagnostics, logging frameworks, backup maintenance and so on.

Why didn't programmers/humans start to automate the task of ensuring that the code they feed to the system won't harm the system? Yes, of course, performance. But hey, this was well before any seriously penetrating hardware standard. Why didn't motherboards get designed with bus architectures and extra processors to facilitate "managed code"?

Is there any metaphor to Model T Fords not being fast enough that I'm missing?

+9  A: 

Computers weren't powerful enough and making them powerful enough was too expensive. When you've only got limited resources at your disposal, every byte and CPU cycle counts.

The first computer I used was a Sinclair ZX Spectrum in 1982. It had less RAM (16K) than the size of a single Windows font file today. And that was relatively recently, in the home computer age. Before the mid-1970s the idea of having a computer in your home was inconceivable.

John Topley
Don't forget that every processor cycle counts too.
sharptooth
Timeslots... oh wait, that's cloud computing ;)
Mafti
I find that explanation too simple. How did the cost of computing power correlate to the cost of debugging and quality problems?
sharkin
This is a simple take on it, but the point is valid.
xan
The computing power did not exist. In 1982, any computer powerful enough to run "managed code" would have been prohibitively expensive and slow. Consider the Intel iAPX 432, a beautiful object-oriented architecture that all but implemented Ada on a chip set. The emulator for it executed one instruction per second. That's not a typo.
John Saunders
This is a great answer. Even today, games on the Wii don't run managed code. It's simply a matter of prerequisites.
Nosredna
+2  A: 

In 1970, the cost of memory was around $1/bit (not adjusted for inflation). You could not afford the luxury of garbage collection with costs like that.
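
To put that in perspective (taking the $1/bit figure at face value), even the 16K of an early-80s home micro would have cost a fortune at 1970 prices:

    16 KB = 16 × 1024 × 8 = 131,072 bits  ≈  $131,000 worth of memory at $1/bit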

DrJokepu
It's not about garbage collection. That's just one of the mechanisms within a managed environment. Maybe not even the most important one.
sharkin
No, it's not just about garbage collection - you can have GC in native code. However, the cost of hardware is a factor in why we didn't have GC from the start (or even in some types of application today).
gbjbaanb
It's about garbage collection. You really can't have a managed environment without it.
Joshua
+11  A: 

Actually, managed code has been around for a very long time. Consider:

  • LISP
  • Smalltalk
  • BASIC (original flavour)

All provided operating-system-like environments which protected the user from memory and other resource-control issues. And all were relative failures (BASIC only really succeeded when features like PEEK & POKE, which allowed you to mess with the underlying system, were introduced).

anon
Don't forget Ada.
Nils Pipenbrinck
BASIC and LISP were interpreted languages, not managed languages.
John Saunders
So if I run LISP on .NET, it's not managed? And many BASIC implementations were compiled to byte code on a line-by-line basis before being executed, just like .NET languages today.
anon
There is more to "managed" code than using a JIT or interpreter. The Garbage Collector is a major factor in a managed environment.
Colin Desmond
Lisp has been a compiled language since 1962.
Vatine
@colin - The LISP, Smalltalk and BASIC environments are all garbage collected.
anon
+17  A: 

Managed code, built-in security and the like have been around for a long time.

There just wasn't room for it in the original PC platform, and it never got added later.

The venerable IBM mainframe has had protected addressing, untouchable kernel libraries, role-based security etc. etc. since the 70s. Plus all that assembler code was managed by a sophisticated (for the time) change management system. (Univac, Burroughs etc. had something similar.)

Unix had fairly decent security built in from the beginning (and it hasn't changed very much over the years).

So I think this is very much a Windows/web-space problem.

There has never been a mainframe virus! Most of the financial transactions in the world pass through these systems at some point, so it's not as if they weren't an attractive target.

The internal IBM mail system did host the first 'trojan' though!

James Anderson
Plus of course IBM has been supporting virtualisation of operating systems for around 30 years!
anon
Protected address spaces are not managed code.
John Saunders
This pretty much depends on the compiler, not on the underlying OS. What does the existence or absence of managed code have to do with Windows?
jn_
+4  A: 

The same reason why there were no trains 300 years ago. The same reason why there were no cell phones 30 years ago. The same reason why we still don't have teleportation machines.

Technology evolves over time; it's called evolution.

Computers weren't powerful enough back then. Running a garbage collector in the background would have killed your application's performance.

Managed code is not centered around garbage collection.
sharkin
It is centered around virtual machines, which essentially emulate machine code. While a good VM should add only a constant overhead per instruction, that overhead was quite a lot for a long time. I remember trying to run Java or homebrew Game Boy games on my 233 MHz Pentium. It was painful.
Kelden Cowan
@R.A - Automatic memory management _is_ a cornerstone of managed execution. Not having to think about memory layout and usage is probably the single greatest advantage, from a software engineering standpoint, of managed code. See http://paulspontifications.blogspot.com/2007/09/composability-and-productivity.html for a good perspective.
codekaizen
A: 

I'd say it's largely been resistance to change, coupled with a false perception that garbage collection is inefficient, that delayed the adoption of GC and related techniques. Of course the brain-dead segmented memory model of the Intel 8086 didn't exactly help promote sane memory management on the PC.
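
For anyone who never had the pleasure: a real-mode address was simply segment × 16 + offset, so different segment:offset pairs alias the same physical byte and nothing stops one "segment" from stomping on another. A rough sketch of the translation (the phys helper is just for illustration):

    #include <stdint.h>
    #include <stdio.h>

    /* 8086 real mode: a 16-bit segment and a 16-bit offset combine into a
       20-bit physical address -- 1 MB total, with heavily overlapping segments. */
    static uint32_t phys(uint16_t segment, uint16_t offset) {
        return ((uint32_t)segment << 4) + offset;
    }

    int main(void) {
        printf("%05lX\n", (unsigned long)phys(0x1234, 0x0010)); /* 12350 */
        printf("%05lX\n", (unsigned long)phys(0x1235, 0x0000)); /* 12350 -- same byte */
        return 0;
    }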

TrayMan
The 80386 pretty much solved the issue of segmentation being a problem. It was an issue only because 8086 had only physical segments, no virtual segments.
John Saunders
80386 only solved the hardware problem, but by then it was also a software problem. It took some time until most PCs were running 32-bit OSes and the segmented memory model was finally abandoned. After that, Java, combined with the popularity of C and C++, likely contributed to the perception that GC was infeasible. Ironically, Java appears to be a significant factor behind the recent trend of increasing use of GC.
TrayMan
I tell you three times: all the world is *not* an x86. If this was the limit you'd have seen "managed" code on *nix systems or Macs or what was that *other* m68k OS with the fanboys?
dmckee
And if x86 wasn't the limit, you'd expect to see 'managed code' in the same era on other platforms. And guess what? You do! Of course the original question was clearly PC-centric, so discussing why programming cultures around some other platforms may or may not have adopted such techniques doesn't really answer the question of why Microsoft is evangelizing the .NET CLR at this moment. And just as a reminder: the best technology does not always win.
TrayMan
A: 

The answer becomes clearer - humans weren't built for writing programs. Machines should be doing it and letting us relax by playing pacman.

Sam
A: 

Just guessing here, but don't managed languages require a virtual machine to execute? An old machine simulating another machine sounds like a performance killer. I would say that today's mobile phones have higher execution speeds and memory capacity than those old computers "back then". Stuff is moving from fast and slim to slower and more abstract, while hardware upgrades keep compensating for the slowdown.

Just my .02, and a lot of guesses.

Simon Svensson
Managed languages need a virtual machine to execute within today, since the more popular computers and operating systems are built the way they are. My question addresses the issue on a more fundamental level. E.g. why didn't early CPU design (and OS design) include the security measures taken by virtual machines today?
sharkin
+6  A: 

Let's think this through from first principles.

A managed platform provides a relatively sandboxed area to run program code that has been compiled from a high-level language into a form more suitable for execution by the platform (IL bytecode). It also provides utility features like garbage collection and module loading.

Now think about a native application - the OS provides a relatively sandboxed area (a process) to run program code that has been compiled from a high-level language into a form more suitable for execution by the platform (x86 opcodes). It also provides utility features like virtual memory management and module loading.

There's not much difference. I think the reason we have managed platforms in the first place is simply that they make coding for the platform easier. They should also make code portable between OSes, but MS didn't care for that. Security is part of the managed platform, but should be part of the OS - e.g. your managed app can write files and similar, just like a normal process. Restricting that is an OS security feature, not an aspect of a managed platform that doesn't exist on native.

Ultimately, they could have put all those managed features into a set of native DLLs and scrapped the idea of the intermediate bytecode, JIT-compiling to native code instead. "Managed" features like GC are easily possible on native heaps - see the Boehm collector for C/C++ for an example.
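
For instance, a minimal sketch of garbage-collected native code using the Boehm collector (assuming libgc is installed and you link with -lgc) is just plain C:

    /* Plain C, compiled straight to native code -- no bytecode, no VM.
       Build with: cc gcdemo.c -lgc */
    #include <gc.h>
    #include <stdio.h>

    int main(void) {
        GC_INIT();
        for (int i = 0; i < 1000000; i++) {
            /* Allocate and drop the reference; the collector reclaims the
               memory, so there is no free() anywhere in the program. */
            int *p = GC_MALLOC(100 * sizeof(int));
            p[0] = i;
        }
        printf("collector heap size: %lu bytes\n",
               (unsigned long)GC_get_heap_size());
        return 0;
    }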

I think MS did it partly because it made the compiler easier to write, and partly because that's how Java was made (and .NET is very much a descendant of Java, if only in spirit), though Java did it that way to make cross-platform coding possible, something MS doesn't care for.

So, why didn't we get managed code from the start? Because all the things you mention as being part of 'managed' code are native code. The managed platforms we have today are simply an additional abstraction on top of an already abstracted platform. High-level languages have had more features added to them to protect you from yourself - buffer overflows are a thing of the past - but there's no reason they couldn't have been implemented in C when C was first invented. It's just that they weren't. Perhaps hindsight makes it seem like these features were missing, but I'm sure in 10 years' time we'll be asking "why didn't C# implement the obviously useful feature XYZ like we have today?"
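
As a trivial illustration of that point, nothing in the early 70s prevented an array type that carries its own length and checks every access (a hypothetical sketch, not anything C actually shipped):

    #include <assert.h>

    /* Hypothetical bounds-checked array: it knows its own length and traps
       on out-of-range access instead of silently corrupting memory. */
    typedef struct {
        int data[64];
        int len;
    } checked_array;

    static void ca_set(checked_array *a, int i, int value) {
        assert(i >= 0 && i < a->len);
        a->data[i] = value;
    }

    static int ca_get(const checked_array *a, int i) {
        assert(i >= 0 && i < a->len);
        return a->data[i];
    }

    int main(void) {
        checked_array a = { .len = 64 };
        ca_set(&a, 10, 7);                    /* fine */
        return ca_get(&a, 10) == 7 ? 0 : 1;   /* ca_get(&a, 64) would abort */
    }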

gbjbaanb
One benefit of IL over native code is that it's verifiable (native code can be very hard, or impossible, to verify). And because it's verifiable, fine-grained security can be applied, whereas for unmanaged code it can't - think ActiveX: an ActiveX add-in has the same security level as its host, while a host for managed code can decide the security level it allows for its add-ins.
Pop Catalin
+7  A: 

Just for the record, we never hand-compiled assembly. We hand-assembled assembly language code. Now that that's clear...

Your analogy is clouding the question because the speed of the car is not analogous to the speed of the computer in this sense: the increasing speed of the car necessitated the changes in auto safety, but it's not the increased speed of the computer that drives the need for changes in computer security - it's the increase in connectivity. From a slightly different angle: for the car, increasing speed is the driving technology for increasing safety. For computers, increasing speed is the enabling technology for increasing safety.

So, the first cars were safe in accidents because they were slow. The first computers were safe because they weren't networked.

Now, cars are made safer through seat belts, air bags, ABS, anti-collision devices, and so forth. Computers are made safe through additional techniques, although you still can't beat unplugging the network cable.

This is a simplification, but I think it gets at the heart of it. We didn't need that stuff back then, because computers weren't connected to the network.

Don Branson
Great angle, thanks!
sharkin
And I'll update the post to be correct about what you actually did with your hands :-)
sharkin
Thanks, and LOL.
Don Branson
Every serious computer (OK, not my Atari 400) that I've used since 1979 has been networked. This idea that the net is something new is, frankly, bollocks.
anon
Sorry, Neil, beg to differ. Sure, Arpanet's been around a long time, but most of us back then were connecting our home computers to BBSs, not to Arpanet. Now that it's common for home computers to be connected to the network all the time, that's a higher level of exposure than dialing up a BBS and downloading software. Plus, now that there's more exposure, there are more people interested in exploiting it, compounding the threat.
Don Branson
Actually, I was thinking of JANET (Joint Academic Network)
anon
A: 

For what it's worth, I read a couple of papers for my computing languages class (one by C.A.R. Hoare and another by Niklaus Wirth) advocating exactly this back in the 60s and 70s, among other things.

I can't speak to exactly why these things didn't happen, but my guess is that it's just one of those things that looks obvious in hindsight that wasn't obvious at the time. It's not that earlier compilers weren't concerned about security. It's that they had different ideas about how to do this.

Hoare mentions the idea of a "checkout compiler". As far as I can tell, this is essentially a compiler that does static analysis. To him, this was a popular technique that failed (or at least didn't solve as many problems as it was intended to solve). The solution, to him, was to make programming languages more secure by creating managed code (or at least that's how he would have put it in modern terms).

I'd imagine that once C (and later C++) caught on, the idea of managed code was essentially dead. It's not that C was a bad language, just that it was intended to be an assembly language rather than an application programming language.

If you get a chance, you might read Hints on Programming Language Design. It's a pretty good read if you're interested in this kind of thing.

Jason Baker
A: 

The best answer to this question is, IMHO, that nobody had the idea of managed code at that time. Knowledge actually evolves over time. Compared to fields like architecture or agriculture, computer science is a very young field, so the collective knowledge about the field is also young and will evolve over time. Perhaps in a few years we'll come across some new phenomenon and someone will be asking the same question: "why didn't somebody think of XYZ before?"

Simply not true. All the ideas *were* there. People thought and wrote about what you could accomplish with them. But they were *expensive* (in money, memory, execution speed, design cycle, whatever...).
dmckee
Indeed, Lisp machines were designed with hardware support for garbage collection from the mid-1970s.
pjc50
+3  A: 

Speaking to your question of why computers didn't have protection mechanisms on the level of managed code (rather than why VMs couldn't run on slow hardware, which is already explained in other posts): the short answer is that they did. CPUs were designed to throw an exception when bad code happened so that it wouldn't damage the system. Windows handles this notoriously poorly, but there are other OSes out there. Unix delivers it as a signal, so the program gets terminated without bringing down the system. Really, whether or not you are running managed code, a null pointer access ends the same way - in program termination. Virtual memory ensures that programs don't mess with other code, so all they can do is hurt themselves.
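
A minimal sketch of that on a POSIX system (nothing exotic, just standard signal handling): the wild write traps in the MMU, the kernel raises SIGSEGV, and only the offending process dies.

    #include <signal.h>
    #include <unistd.h>

    static void on_segv(int sig) {
        /* The kernel delivered SIGSEGV; nothing outside this process was
           touched. Log and terminate using async-signal-safe calls only. */
        (void)sig;
        write(2, "caught SIGSEGV, exiting\n", 24);
        _exit(1);
    }

    int main(void) {
        signal(SIGSEGV, on_segv);
        volatile int *p = 0;
        *p = 42;    /* wild write: trapped by the MMU and the OS, no VM required */
        return 0;
    }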

Which brings me to my second point. All this is unnecessary if you know what you are doing. If I want to keep my furniture clean, I simply don't drop food on it. I don't need to cover my house in plastic, I just have to be careful. If you're a sloppy coder, the best VM in the world isn't going to save you; it will just allow you to run your sloppy code without any noise. Also, porting code is easy if you use proper encapsulation. When you are a good coder, managed code doesn't help much. That is why not everyone is using it. It is simply a matter of preference, not better/worse.

As far as run-time security goes, there's nothing a P-code compiler can predict that a machine-code compiler can't, and nothing a managed-code interpreter can handle that the OS can't (or doesn't) already. Motherboards with extra buses, CPUs and instruction sets cost a lot more money - IT is all about the cost/performance ratio.

Kelden Cowan
"All this is unnecessary if you know what you are doing" -- Absolutely true. But it limits the number of people who can do good work, and speed with which good work can be done. Which is why it was the right choice in the '70s and '80s, and still is for embedded and life- or mission-critical work, but may not make the most sense for banging out utility apps for the web...
dmckee
I think poor coders, given an easy-to-use tool, still write poor code.
gbjbaanb
@gbjbaanb: Good enough tools let poor coders write less-than-optimal-but-still-working code, as opposed to totally-and-dangerously-broken code. It's a small gain, but...
dmckee
+1  A: 

I think that, as with most questions of the form "Why didn't we have X in programming Y years ago?", the answer is speed and resource allocation. With limited resources, they needed to be managed as effectively as possible. The general-purpose kind of management associated with managed code would have been too resource-consuming to be of benefit in the performance-critical applications of the time. This is also part of why today's performance-critical code is still written in C, Fortran or assembler.

jheriko
+1  A: 

Why didn't we just build airplanes and spaceships right away, instead of mucking around with horses and carriages and all that tedious stuff?

Lasse V. Karlsen
A: 

In a single word, Cost.

Agent389