views:

1190

answers:

7

Hello all,

I've been thinking about this question for a long time, but I couldn't find an answer on Google, nor a similar question on Stack Overflow. If there is a duplicate, I'm sorry for that.

A lot of people seem to say that writing compilers and other language tools in functional languages such as OCaml and Haskell is much more efficient and easier than writing them in imperative languages.

Is this true? And if so -- why is it so efficient and easy to write them in functional languages instead of in an imperative language like C? Also -- isn't a language tool written in a functional language slower than one written in some low-level language like C?

Thanks in advance,

William v. Doorn

+19  A: 

A lot of compiler tasks are pattern matching on tree structures.

Both OCaml and Haskell have powerful and concise pattern matching capabilities.

It's harder to add pattern matching to imperative languages, because whatever value is evaluated or extracted to match the pattern against must be side-effect free.
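To make that concrete, here is a minimal sketch in Haskell (the `Expr` type and `simplify` pass are invented for illustration): a constant-folding rewrite over an expression tree is just a handful of pattern-match clauses.

```haskell
-- A tiny expression type and a constant-folding pass over it.
data Expr = Lit Int
          | Add Expr Expr
          | Mul Expr Expr
          deriving (Show, Eq)

-- Each rewrite rule is one pattern-match clause.
simplify :: Expr -> Expr
simplify (Add (Lit a) (Lit b)) = Lit (a + b)    -- fold constant additions
simplify (Mul (Lit 1) e)       = simplify e     -- 1 * e  ==  e
simplify (Mul e (Lit 1))       = simplify e     -- e * 1  ==  e
simplify (Add l r)             = Add (simplify l) (simplify r)
simplify (Mul l r)             = Mul (simplify l) (simplify r)
simplify e                     = e
```

Each clause reads almost like the specification of the rewrite rule itself; the equivalent imperative C would need a tag enum, a union or casts, and nested conditionals.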

Pete Kirkham
Sounds like a reasonable answer, but is this the only thing? e.g. would things such as tail recursion also play a role?
wvd
That would seem to indicate that it is more of an issue of the type system than of the actual execution model. Something based on imperative programming with immutable values over structural types might be fine.
Donal Fellows
@wvd: Tail recursion optimization is an implementation detail, not a language feature as such, that makes linear recursive functions equivalent to an iterative loop. A recursive function to walk a linked list in C would benefit from it just as much as recursing on a list in Scheme does.
camccann
@wvd GCC's C has tail-call elimination, as do other mutable-state languages
Pete Kirkham
@wvd: I would say no on the tail recursion; in the end it is only an optimization. Lots of Lisps are great for writing compilers in, and only a subset of those promise TCO.
Ukko
As the JMatch project (http://www.cs.cornell.edu/Projects/jmatch/) shows, it's not impossible to add powerful pattern matching to a completely non-functional language like Java. However, it is a fact that currently the only common languages that have pattern matching are (at least partly) functional in nature.
sepp2k
@camccann: If the language standard guarantees TCO (or at least guarantees that recursive functions of a certain form will never cause a stack overflow or linear growth of memory consumption), I'd consider that a language feature. If the standard doesn't guarantee it, but the compiler does it anyway, it's a compiler feature.
sepp2k
@sepp2k: It's an odd case because it's an "optimization" that's all but required for a functional language compiler to produce useful output. Language specifications do sometimes dictate implementation details or performance bounds for various reasons, but I have a hard time calling such things features of a language itself. Naught but quibbling over terminology, though, so disregard me at will!
camccann
+41  A: 

Oftentimes a compiler works a lot with trees. The source code is parsed into a syntax tree. That tree might then be transformed into another tree with type annotations to perform type checking. Next you might convert that tree into a tree containing only core language elements (converting syntactic-sugar notations into an unsugared form). Then you might perform various optimizations that are basically transformations on the tree. After that you would probably create a tree in some normal form and then iterate over that tree to generate the target (assembly) code.

Functional languages have features like pattern matching and good support for efficient recursion, which make it easy to work with trees, so that's why they're generally considered good languages for writing compilers.
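As a sketch of one such tree-to-tree pass, here is a hypothetical desugaring step in Haskell (the `Surface` and `Core` types are invented for illustration); structural recursion with pattern matching maps each sugared node to its core form:

```haskell
-- Surface syntax with sugar, and a smaller core language without it.
data Surface = SLit Int
             | SNeg Surface          -- sugar: unary minus
             | SAdd Surface Surface

data Core = CLit Int
          | CAdd Core Core
          | CSub Core Core
          deriving (Show, Eq)

-- Desugaring is a structural recursion: unary minus becomes 0 - e.
desugar :: Surface -> Core
desugar (SLit n)   = CLit n
desugar (SNeg e)   = CSub (CLit 0) (desugar e)
desugar (SAdd l r) = CAdd (desugar l) (desugar r)
```

Type checking, optimization, and code generation passes all follow this same shape: one function per pass, one clause per node kind.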

sepp2k
Most complete answer so far, I'll mark this as the accepted answer, however I think Pete Kirkham's answer is also good.
wvd
What about "proving correctness"? Since correctness of a compiler is an important attribute, I have often heard that fans of functional languages incorporate a "proof" of correctness into their workflow somehow. I have no idea what that really means in practical terms, but as compiler reliability is important, this seems worthwhile.
Warren P
@WarrenP: The "proof-carrying code" concept comes from statically-typed functional languages. The idea is that you use the type system in such a way that a function can only typecheck if it's correct, so the fact that the code compiles is the proof of correctness. Of course this isn't fully possible while keeping the language Turing-complete and typechecking decidable. But the stronger the type system, the closer you can get to that goal.
sepp2k
The reason this concept is mainly popular in the functional community is that in languages with mutable state, you'd also have to encode information about when and where state change occurs in the types. In languages where you know that the result of a function depends only on its arguments, it's much easier to encode a proof in the types (it's also much easier to manually prove the code's correctness, because you don't have to consider which global states are possible and how they will affect the behavior of the function). However, none of this is specifically related to compilers.
sepp2k
@Warren P: Typically that means a mathematical specification of what the program should do, and a proof that the program in fact does that. In compiler terms, this would mean something like "given this definition of the source language, the output code will perform the same computations, and any optimizations will alter only the time/space usage of the program, not its behavior".
camccann
The single most important feature is pattern matching in my opinion. Optimizing an abstract syntax tree with pattern matching is stupidly easy. Doing it without pattern matching is often frustratingly hard.
Bob Aman
+6  A: 

One important factor to consider is that a big milestone in any compiler project is reached when you can self-host the compiler and "eat your own dog food." For this reason, when you look at languages like OCaml that were designed for language research, you find they tend to have great features for compiler-type problems.

In my last compiler-esque job we used OCaml for exactly this reason while manipulating C code; it was just the best tool around for the task. If the INRIA folks had built OCaml with different priorities, it might not have been such a good fit.

That said, functional languages are the best tool for solving any problem, so it logically follows that they are the best tool for solving this particular problem. QED.

/me: crawls back to my Java tasks a little less joyfully...

Ukko
-1 for "functional languages are the best tool for solving any problem." If this were true, we'd all be using them everywhere. ;)
Andrei Krotkov
@Andrei Krotkov: Today's word of the day is fa·ce·tious. Pronunciation: \fə-ˈsē-shəs\. Function: adjective. Etymology: Middle French facetieux, from facetie jest, from Latin facetia. Date: 1599. 1: joking or jesting often inappropriately: waggish <just being facetious>. 2: meant to be humorous or funny: not serious <a facetious remark>. Synonyms: see witty. On top of missing the joke, your logic is still flawed. You are assuming that all people are rational actors, and that, I am afraid, is not a fair assumption.
Ukko
I guess I missed the joke, as I know people in real life that would say pretty much the exact thing, except fully seriously. Poe's law I guess. http://tvtropes.org/pmwiki/pmwiki.php/Main/PoesLaw
Andrei Krotkov
@Andrei: Using your *argumentum ad populum* : "If reason is better than emotional ignorance, we'd all be using it everywhere."
Tim Schaeffer
+2  A: 

See also

http://stackoverflow.com/questions/1936471/f-design-pattern

FP groups things 'by operation', whereas OO groups things 'by type', and 'by operation' is more natural for a compiler/interpreter.
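A small Haskell sketch of the "by operation" grouping (the type and function names are invented for illustration): each compiler operation over the syntax type lives in a single function, so adding a new operation never touches existing code, while in the OO arrangement each operation would be smeared across the node classes.

```haskell
data Expr = Lit Int | Add Expr Expr

-- 'By operation': the whole evaluator lives in one function.
eval :: Expr -> Int
eval (Lit n)   = n
eval (Add a b) = eval a + eval b

-- Adding a new operation is just adding one new function;
-- the Expr type and eval are left untouched.
render :: Expr -> String
render (Lit n)   = show n
render (Add a b) = "(" ++ render a ++ " + " ++ render b ++ ")"
```

The trade-off cuts the other way for adding new node kinds, which is the "expression problem" mentioned below.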

Brian
This relates to what is called, in some Programming Language Theory circles, the "expression problem". For example, see [this question](http://stackoverflow.com/questions/2807629/), wherein I demonstrate some truly horrible Haskell code that does things the "extensible types" way. Contrariwise, forcing an OOP language into the "extensible operations" style tends to motivate the Visitor Pattern.
camccann
+2  A: 

Seems like everyone missed another important reason: it's quite easy to write an embedded domain-specific language (EDSL) for parsers which looks a lot like (E)BNF in normal code. Parser combinators like Parsec are quite easy to write in functional languages using higher-order functions and function composition. Not only is it easier, it's also very elegant.

Basically, you represent the simplest generic parsers as plain functions, and you have special operations (typically higher-order functions) that let you compose these primitive parsers into more complicated, more specific parsers for your grammar.

This is not the only way to build parser frameworks, of course.
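As a sketch of the idea (this is not Parsec itself; the names and types here are invented for illustration), a parser can be represented as a function from input to an optional result plus leftover input, and combinators compose small parsers into bigger ones:

```haskell
import Data.Char (isDigit)

-- A parser is a function from input to Maybe (result, remaining input).
newtype Parser a = Parser { runParser :: String -> Maybe (a, String) }

-- A primitive parser: consume one character satisfying a predicate.
satisfy :: (Char -> Bool) -> Parser Char
satisfy p = Parser $ \s -> case s of
  (c:cs) | p c -> Just (c, cs)
  _            -> Nothing

-- A combinator: apply a parser one or more times.
many1 :: Parser a -> Parser [a]
many1 p = Parser $ \s -> case runParser p s of
  Nothing      -> Nothing
  Just (x, s') -> case runParser (many1 p) s' of
    Just (xs, s'') -> Just (x : xs, s'')
    Nothing        -> Just ([x], s')

-- Composing primitives into a more specific parser.
number :: Parser Int
number = Parser $ \s -> case runParser (many1 (satisfy isDigit)) s of
  Just (ds, s') -> Just (read ds, s')
  Nothing       -> Nothing
```

A grammar built this way reads top-down much like its (E)BNF, with each nonterminal becoming one named parser value.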

snk_kid
+2  A: 

Basically, a compiler is a transformation from one set of code to another — from source to IR, from IR to optimized IR, from IR to assembly, etc. This is precisely the sort of thing functional languages are designed for — a pure function is just a transformation from one thing to another. Imperative functions don't have this quality. Although you can write this kind of code in an imperative language, functional languages are specialized for it.
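This view can be sketched as ordinary function composition (the pass names and the String stand-ins for the real intermediate representations are invented for illustration):

```haskell
-- String stand-ins for the real data structures at each stage.
type Source = String
type IR     = String
type Asm    = String

-- Each pass is a pure function from one representation to the next.
parseSource :: Source -> IR
parseSource s = "ir(" ++ s ++ ")"

optimize :: IR -> IR
optimize ir = "opt(" ++ ir ++ ")"

codegen :: IR -> Asm
codegen ir = "asm(" ++ ir ++ ")"

-- The whole compiler is just the composition of its passes.
compile :: Source -> Asm
compile = codegen . optimize . parseSource
```

Because each pass is pure, passes can be reordered, tested, and reasoned about in isolation.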

Chuck
+2  A: 

One possibility is that a compiler tends to have to deal very carefully with a whole host of corner cases. Getting the code right is often made easier by using design patterns that structure the implementation in a way that parallels the rules it implements. Often that ends up being a declarative (pattern matching, "where") rather than imperative (sequencing, "when") design and thus easier to implement in a declarative language (and most of them are functional).

BCS