views: 462
answers: 7
I'm looking for a Haskell compiler that uses strict evaluation by default instead of lazy evaluation. I would just use OCaml, but Haskell's syntax is so much better than OCaml's (and Haskell is pure, and has cool features such as type classes).

I'd really rather not constantly put !s and $!s all over my program. A compiler with a switch or a preprocessor to put in the strictness annotations would be really nice. It would also be helpful if there was a way to use lazy evaluation in certain places too, just in case I want something like an infinite list (I probably never will).
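
For context, this is the kind of annotation noise I mean; a tiny made-up sketch, not code from my actual program:

    {-# LANGUAGE BangPatterns #-}

    -- Strict fields: the !s force both coordinates when a Point is built.
    data Point = Point !Double !Double

    -- A hand-strictified loop: the bang pattern keeps the accumulator
    -- evaluated instead of piling up thunks.
    -- (Equivalently: go acc (x:xs) = (go $! acc + x * x) xs)
    sumSquares :: [Double] -> Double
    sumSquares = go 0
      where
        go !acc []     = acc
        go !acc (x:xs) = go (acc + x * x) xs

    main :: IO ()
    main = print (sumSquares [1 .. 1000000])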

Please do not try to convince me that lazy evaluation is better; I really need the performance. IIRC, Simon Peyton Jones even said that lazy evaluation wasn't really necessary, and that it was there mostly to prevent them from making the language impure.

+8  A: 

There have been two attempts at strictly evaluating Haskell in the past:

Both were focused on keeping Haskell's non-strict semantics while using a mostly-strict evaluation strategy, rather than actually changing the semantics, and neither ever really saw the light of day.

Edit: Martijn's suggestion of strict-plugin looks ideal for your purposes, as it actually does what you want and the author is still active in the Haskell community; I'd forgotten about it.

Ganesh Sittampalam
+4  A: 

I think that Jan-Willem Maessen's pH compiler is/was strict. The next closest is Robert Ennals's speculative evaluation fork of GHC 5. The spec_eval fork is not strict, but instead evaluates optimistically. I don't know if either of those is still current/usable/etc.

shapr
+6  A: 

See also ghc-strict-plugin, an example plugin for GHC's plugin framework, described in The Monad Reader, Issue 12.

Martijn
+6  A: 

I'd really rather not constantly put !s and $!s all over my program

You're doing it wrong, if that's how you're programming Haskell :) You simply won't need to do this. Use GHC, use -O2, use strict data types when appropriate, use lazy ones when appropriate. Don't assume laziness is going to be a problem - it is a solution to a lot of problems.
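
To give a rough sense of what "strict data types when appropriate" looks like (an illustrative sketch with made-up types, compiled with ghc -O2, not a prescription):

    import Data.List (foldl')

    -- Strict fields where values accumulate, lazy structure elsewhere.
    data Stats = Stats { count :: !Int, total :: !Double }

    step :: Stats -> Double -> Stats
    step (Stats n t) x = Stats (n + 1) (t + x)

    -- foldl' forces each intermediate Stats to WHNF, and the strict fields
    -- then force the counts themselves, so no thunks pile up.
    summarise :: [Double] -> Stats
    summarise = foldl' step (Stats 0 0)

    main :: IO ()
    main = let Stats n t = summarise [1 .. 1000000] in print (n, t)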

Don Stewart
+3  A: 

I feel your pain. My biggest PITA in my day-to-day programming is dealing with those !@#$%^&( space leaks.

However, if it helps, with time you do learn (the hard way) about how to deal with this, and it does get better. But I'm still waiting for Andy Gill to come out with his magical space leak profiler to fix all of my problems. (I'm taking his off-hand comment to me at the last ICFP that he'd dreamed up this cool idea as a promise to implement it.)

I won't try to convince you that lazy evaluation is the best thing in the world, but there are certain good points about it. I've got some stream-processing programs that scoot lazy lists through any variety of combinators, and they run happily on gigabytes of data while using only 3.5 MB or so of memory (of which more than 2 MB is GHC runtime). And someone smarter than I am pointed out to me last year that you would be quite surprised, as a typical Haskell programmer, at how much you depend on lazy evaluation.
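
A toy version of that stream-processing style, just to illustrate (it assumes line-oriented input and is not the actual program described above):

    import Data.List (isInfixOf)

    -- Lazily reads stdin, keeps only the lines mentioning "ERROR", and
    -- numbers them. The input streams through and is garbage-collected
    -- behind us, so memory use stays flat however large the input is.
    main :: IO ()
    main = interact $
          unlines
        . zipWith (\n l -> show n ++ ": " ++ l) [1 :: Int ..]
        . filter ("ERROR" `isInfixOf`)
        . lines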

But what we really need is a really good book on dealing with lazy evaluation in the real world (which is not so different from the academic world, really, except that when things go wrong they simply don't get a paper published, whereas we get clients coming after us with knives). It should properly cover most of the issues relating to this and, more importantly, give us an intuitive sense of what's going to explode our heap and what isn't.

I don't think that this is a new thing; I'm sure other languages and architectures have been through this too. How did the first programmers deal with hardware stacks and all that, after all? Not so well, I bet.

Curt Sampson
BTW, you can, with the help of Template Haskell, make all sorts of things instances of `NFData`, and get strict out the wazoo, many, many times over. I learned the hard way that this is not the best solution to anything except truly blowing out your CPU cache....
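
For the curious, a hand-written instance looks roughly like this; the Template Haskell route just generates code of the same shape. A sketch over a made-up type:

    import Control.DeepSeq (NFData (..), deepseq)

    data Tree a = Leaf | Node (Tree a) a (Tree a)

    -- rnf walks the entire structure and forces every field; calling it
    -- again later re-traverses parts that are already evaluated.
    instance NFData a => NFData (Tree a) where
      rnf Leaf         = ()
      rnf (Node l x r) = rnf l `seq` rnf x `seq` rnf r

    main :: IO ()
    main = let t = Node Leaf (42 :: Int) (Node Leaf 7 Leaf)
           in t `deepseq` putStrLn "fully evaluated"
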
Curt Sampson
+9  A: 

If you have a Haskell compiler that uses strict evaluation, it doesn't compile Haskell. Laziness is part of the Haskell spec!
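
A tiny example of what that means in practice: the meaning of ordinary Haskell code can depend on an argument never being forced, so a strict compiler would change its behaviour.

    -- Under Haskell's non-strict semantics this prints 1; a compiler that
    -- forced every argument would evaluate 'undefined' and crash instead.
    ignoreFirst :: a -> Int
    ignoreFirst _ = 1

    main :: IO ()
    main = print (ignoreFirst undefined)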

However, there are alternatives.

  • DDC is an attempt to create an explicitly lazy variant of Haskell which supports things like destructive update whilst retaining all the rest of Haskell's goodness. There is one problem: the compiler is currently only in the α-stage, although it seems to be at least usable.

  • Create a preprocessor, as others have done.

  • Learn to use Haskell “the right way”. If you can simplify your test case down to something which is publicly displayable, you could post it on the Haskell-Café mailing list, where people are very helpful with these sorts of questions concerning the effects of non-strictness.
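
As a small illustration of the kind of fix that usually comes out of such a thread (a generic sketch, nothing to do with any particular program): the classic foldl space leak and its foldl' cure.

    import Data.List (foldl')

    -- The lazy foldl builds up a chain of a million (+) thunks before
    -- anything is evaluated: the classic space-leak shape (at least
    -- without optimisation).
    leaky :: Integer
    leaky = foldl (+) 0 [1 .. 1000000]

    -- foldl' forces the accumulator at each step and runs in constant space.
    fixed :: Integer
    fixed = foldl' (+) 0 [1 .. 1000000]

    main :: IO ()
    main = print (leaky, fixed)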

Porges
DDC looks really nice. Now I have to master the effect system instead of monads...
Zifre
+3  A: 

Using `NFData` and `rnf` everywhere isn't a solution, since it means repeatedly traversing large structures that have already been evaluated.

The introductory chapter of Ben Lippmeier's PhD thesis (about DDC) is about the best critique of Haskell that I've seen; it discusses issues of laziness, destructive update, monad transformers, etc. DDC has laziness, but you have to request it explicitly, and it's considered an effect, which is tracked and managed by DDC's type-and-effect system.

solrize