I know GC wasn't popular in the days when Ada was developed, and for its main use case of embedded programming it is still not a good choice.

But considering that Ada is a general-purpose programming language, why wasn't a partial, optional garbage collector (one that traces only explicitly tagged memory objects) introduced in later revisions of the language and in the compiler implementations?

I simply can't imagine developing a normal desktop application without a garbage collector anymore.

+16  A: 

Ada was designed with military applications in mind. One of the big priorities in its design was determinism: one wanted an Ada program to perform exactly the same way every time, in any environment, under any operating system... that kind of thing.

A garbage collector turns one application into two, working against one another. Java programs develop hiccups at random intervals when the GC decides to go to work, and if it's too slow about it there's a chance that an application will run out of heap sometimes and not others.

Simplified: A garbage collector introduces some variability into a program that the designers didn't want. You make a mess - you clean it up! Same code, same behavior every time.

Not that Ada became a raging worldwide success, mind you.

Carl Smotricz
Okay, but why does Ada still not have a garbage collector after two major language revisions? Not even an optional one? C++ didn't get one for only one reason: there wasn't enough time to specify it correctly.
Lothar
Nothing in the language says "no garbage collector" as you are implying. If somebody wants one in their compiler, they are free to put one in. Some compilers have.
T.E.D.
Sorry, this answer is just plain wrong. For instance, large sections of the tasking model in Ada are non-deterministic. There is nothing in the Ada language spec that prevents GC, and there are optional GC add-on packages if you want them.
YermoungDer
+8  A: 

Because Ada was designed for use in defense systems which control weapons in real time, and garbage collection interferes with the timing of your application. This is dangerous, which is why, for many years, Java came with a warning that it was not to be used for healthcare and military control systems.

I believe the reason there is no longer such a disclaimer with Java is that the underlying hardware has become much faster, and that Java now has better GC algorithms and better control over GC.

Remember that Ada was developed in the 1970s and 1980s, at a time when computers were far less powerful than they are today, and in control applications timing issues were paramount.

Michael Dillon
+4  A: 

The answer is more complicated: Ada does not require a garbage collector, because of real-time constraints and the like. However, the language has been cleverly designed to allow the implementation of a garbage collector.

Although many (almost all) compilers do not include a garbage collector, there are some notable implementations:

  • a patch for GNAT
  • Ada compilers targeting the Java Virtual Machine (I don't know if those projects are still supported); they use the garbage collector of the JVM.

There are plenty of other sources about garbage collection in Ada around the web. This subject has been discussed at length, mainly because of the fierce competition with Java in the mid '90s (have a look at the page "Ada 95 is what the Java language should have been"), when Java was "The Next Big Thing" before Microsoft drew up C#.

Adrien Plisson
+2  A: 

First off, there is nothing in the language that really prohibits garbage collection.

Secondly, some implementations do perform garbage collection. In particular, all the implementations that target the JVM garbage collect.

Thirdly, there is a way to get some amount of garbage collection with all compilers. You see, when an access type goes out of scope, if you specifically told the language to set aside a certain amount of space for storage of its objects, then that space will be destroyed at that point. I've used this in the past to get some modicum of garbage collection. The declaration voodoo you use is:

type Foo is access Blah;
for Foo'Storage_Size use 100_000;  -- 100K

If you do this, then all (100K of) memory allocated to Blah objects pointed to by Foo pointers will be cleaned up when the Foo type goes out of scope. Since Ada allows you to nest subroutines inside of other subroutines, this is particularly powerful.
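As an illustration (a minimal sketch; the subprogram and type names here are made up), the pattern looks like this when the access type lives inside a subprogram:

--  Sketch: a bounded storage pool tied to a local access type. Everything
--  allocated through Node_Ptr comes from that pool, and the whole pool is
--  released when Process returns.
procedure Process is
   type Node is record
      Value : Integer;
   end record;

   type Node_Ptr is access Node;
   for Node_Ptr'Storage_Size use 100_000;  -- reserve ~100K for this pool

   P : Node_Ptr;
begin
   for I in 1 .. 1_000 loop
      P := new Node'(Value => I);  -- allocated from the local pool
      --  ... use P.all ...
   end loop;
end Process;  -- the pool, and every Node in it, is reclaimed here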

To see more about what Storage_Size and storage pools can do for you, see LRM 13.11.

Fourthly, well-written Ada programs don't tend to rely on dynamic memory allocation nearly as much as C programs do. C had a number of design holes that practitioners learned to use pointers to paint over. A lot of those idioms aren't necessary in Ada.
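One example of such an idiom (a sketch with made-up names, not from the original answer): a buffer whose size is only known at run time, which in C would typically mean a malloc/free pair, but which Ada lets you declare with run-time bounds and reclaim automatically:

--  Sketch: a run-time-sized buffer with no heap allocation at all.
procedure Read_Message (Length : in Natural) is
   Buffer : String (1 .. Length);  -- bounds computed at run time
begin
   --  ... fill and use Buffer here ...
   null;
end Read_Message;  -- Buffer disappears automatically on return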

T.E.D.
A: 

First off, I'd like to know who's using Ada these days. I actually like the language, and there's even a GUI library for Linux/Ada, but I haven't heard anything about active Ada development for years. Thanks to its military connections, I'm really not sure if it's ancient history or so wildly successful that all mention of its use is classified.

I think there are a couple of reasons for no GC in Ada. First and foremost, it dates back to an era when most compiled languages used primarily stack or static memory, or in a few cases explicit heap allocate/free. GC as a general philosophy really only took off around 1990 or so, when OOP, improved memory-management algorithms and processors powerful enough to spare the cycles to run it all came into their own. What merely compiling Ada could do to an IBM 4331 mainframe in 1989 was simply merciless. Now I have a cell phone that can outperform that machine's CPU.

Another good reason is that there are people who think that rigorous program design includes precise control over memory resources, and that there shouldn't be any tolerance for letting dynamically acquired objects float. Sadly, far too many people ended up leaking memory as dynamic memory became more and more the rule. Plus, like the "efficiency" of assembly language over high-level languages, and the "efficiency" of raw JDBC over ORM systems, the "efficiency" of manual memory management tends to invert as it scales up (I've seen ORM benchmarks where the JDBC equivalent was only half as efficient). Counter-intuitive, I know, but these days systems are much better at globally optimizing large applications, and they're able to make radical re-optimizations in response to superficially minor changes, including dynamically re-balancing algorithms on the fly based on detected load.

I'm afraid I'm going to have to differ with those who say that real-time systems can't afford GC memory. GC is no longer something that freezes the whole system every couple of minutes. We have much more intelligent ways to reclaim memory these days.

Tim Holloway
Well, there is a new version of the Ada standard being worked on, so the language is being actively developed. The GCC Ada compiler is actively developed too.
T.E.D.
I think you are onto something in your third paragraph. Ada is definitely a language that caters to control freaks. If you want, you can go in and tell the compiler precisely what offsets and which bits to use for every field in a record type. You can't even do that in C. Someone who likes that kind of thing isn't liable to be a fan of sloppy dynamic memory usage practices.
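To illustrate that point (a sketch with made-up field names, not from the thread), a record representation clause like the following pins every field to an exact byte offset and bit range:

--  Sketch: the compiler is told the precise layout of every field.
type Status_Register is record
   Ready    : Boolean;
   Error    : Boolean;
   Channel  : Integer range 0 .. 15;
   Sequence : Integer range 0 .. 255;
end record;

for Status_Register use record
   Ready    at 0 range 0 .. 0;  -- byte 0, bit 0
   Error    at 0 range 1 .. 1;  -- byte 0, bit 1
   Channel  at 0 range 2 .. 5;  -- byte 0, bits 2-5
   Sequence at 1 range 0 .. 7;  -- byte 1, all 8 bits
end record;

for Status_Register'Size use 16;  -- the whole record is exactly 16 bits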
T.E.D.
+1  A: 

Your question is incorrect. It does. See the package Ada.Finalization, which handles GC for you.

Nigel.
That is really more of a method of providing for RAII, like you can with your typical C++ class. If you want to use it to GC stuff allocated in an Ada class, you have to write it (correctly!) to do that. It's not quite the same as general brainless GC like Java requires of its compilers. However, I believe all Ada compilers targeting the JVM (Java Virtual Machine) use its garbage collection.
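For reference, here is a minimal sketch (with made-up names) of that pattern: a type derived from Ada.Finalization.Controlled whose Finalize you write yourself, so the heap memory it owns is freed as soon as the handle goes out of scope:

with Ada.Finalization;
with Ada.Unchecked_Deallocation;

package Auto_Buffers is
   type Buffer is array (Positive range <>) of Character;
   type Buffer_Access is access Buffer;

   --  A controlled "handle": Finalize runs automatically whenever a
   --  Handle object goes out of scope, freeing the buffer it owns.
   type Handle is new Ada.Finalization.Controlled with record
      Data : Buffer_Access;
   end record;

   procedure Finalize (Object : in out Handle);
end Auto_Buffers;

package body Auto_Buffers is
   procedure Free is new Ada.Unchecked_Deallocation (Buffer, Buffer_Access);

   procedure Finalize (Object : in out Handle) is
   begin
      Free (Object.Data);  -- a no-op if Data is already null
   end Finalize;
end Auto_Buffers;

Nothing here is collected in the tracing-GC sense; the reclamation point is simply tied to the handle's scope, which is exactly the RAII trade-off described above.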
T.E.D.