views:

2665

answers:

18

I know a few programmers who keep talking about Haskell when they are among themselves, and here on SO everyone seems to love that language. Being good at Haskell seems somewhat like the hallmark of a genius programmer.

Can someone give a few Haskell examples that show why it is so elegant / superior?

+8  A: 

Software Transactional Memory is a pretty cool way to deal with concurrency. It's much more flexible than message passing, and not deadlock prone like mutexes. GHC's implementation of STM is considered one of the best.
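
For a taste of what that looks like, here's a minimal sketch (the transfer function and the toy balances are invented for illustration; only the stm package's basic API is assumed):

import Control.Concurrent.STM

-- Move an amount between two shared balances. The whole block commits
-- atomically: no locks to take in the right order, no deadlock to reason about.
transfer :: TVar Int -> TVar Int -> Int -> STM ()
transfer from to amount = do
  modifyTVar' from (subtract amount)
  modifyTVar' to   (+ amount)

main :: IO ()
main = do
  a <- newTVarIO 100
  b <- newTVarIO 0
  atomically (transfer a b 40)
  balances <- atomically $ do
    x <- readTVar a
    y <- readTVar b
    return (x, y)
  print balances   -- (60,40)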

bmdhacks
+6  A: 

I couldn't give you an example, I'm an OCaml guy, but when I'm in such a situation as yourself, curiosity just takes hold and I have to download a compiler/interpreter and give it a go. You'll likely learn far more that way about the strengths and weaknesses of a given functional language.

Andy
Okay, I'll try that! Especially after reading all those interesting answers! :)
Don't forget to read the compiler's source code. That will also give you a lot of valuable information.
Jon Harrop
+7  A: 

For an interesting example you can look at: http://en.literateprograms.org/Quicksort_(Haskell)

What is interesting is to compare the implementations across the various languages.

What makes Haskell so interesting, along with other functional languages, is the fact that you have to think differently about how to program. For example, you will generally not use for or while loops, but will use recursion.

As mentioned above, Haskell and other functional languages excel at parallel processing and at writing applications that take advantage of multiple cores.
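
For reference, the list-based quicksort on the linked page looks roughly like this (a from-memory sketch, not a copy of the linked code; see the comments below about how it differs from a true in-place quicksort):

-- Pick the head as pivot, then sort the smaller and larger elements recursively.
quicksort :: Ord a => [a] -> [a]
quicksort []     = []
quicksort (p:xs) = quicksort [x | x <- xs, x < p]
                   ++ [p]
                   ++ quicksort [x | x <- xs, x >= p]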

James Black
recursion is the bomb. that and pattern matching.
Ellery Newcomer
Getting rid of for and while loops is the hardest part for me when writing in a functional language. :)
James Black
Learning to think in recursion instead of loops was the hardest part for me too. When it finally sunk in, it was one of the greatest programming epiphanies I've ever had.
Chris Connett
Except that the working Haskell programmer rarely uses primitive recursion; mostly you use library functions like map and foldr.
Paul Johnson
But you need to understand what is happening under the hood, otherwise you may have a stack overflow.
James Black
I find it more interesting that Hoare's original quicksort algorithm was bastardized into this out-of-place list-based form apparently so that uselessly inefficient implementations could be written "elegantly" in Haskell. If you try to write a real (in-place) quicksort in Haskell, you'll find it is ugly as hell. If you try to write a competitively-performant generic quicksort in Haskell, you'll find it is actually impossible due to long-standing bugs in GHC's garbage collector. Hailing quicksort as a good example for Haskell beggars belief, IMHO.
Jon Harrop
+34  A: 

The way it was pitched to me, and what I think is true after having worked on learning Haskell for a month now, is that functional programming twists your brain in interesting ways: it forces you to think about familiar problems differently: instead of loops, think in maps and folds and filters, etc. In general, having more than one perspective on a problem makes you better able to reason about it, and to switch viewpoints as necessary.
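
For example, "sum the squares of the even numbers" stops being a loop with an accumulator and becomes a pipeline (a made-up toy, but representative):

-- filter, then map, then fold, instead of a loop and a running total
sumOfEvenSquares :: [Int] -> Int
sumOfEvenSquares = foldr (+) 0 . map (^ 2) . filter even

-- sumOfEvenSquares [1..10] == 220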

The other really neat thing about Haskell is its type system. It's statically typed, but the type inference engine makes it feel like a Python program that magically tells you when you've made a stupid type-related mistake. Haskell's error messages in this regard are somewhat lacking, but as you get more acquainted with the language you'll say to yourself: this is what typing is supposed to be!

Edward Z. Yang
+12  A: 

What really sets Haskell apart is the effort it goes to in its design to enforce functional programming. You can program in a functional style in pretty much any language, but it's all too easy to abandon at the first convenience. Haskell does not allow you to abandon functional programming, so you must take it to its logical conclusion, which is a final program that is easier to reason about, and sidesteps a whole class of the thorniest types of bugs.

When it comes to writing a program for real world use, you may find Haskell lacking in some practical fashion, but your final solution will be better for having known Haskell to begin with. I'm definitely not there yet, but so far learning Haskell has been much more enlightening than say, Lisp was in college.

dasil003
+5  A: 

One thing I find very cool when dealing with algorithms or mathematical problems is Haskell's inherent lazy evaluation of computations, which is made practical by its purely functional nature.

For example, if you want to calculate all primes, you could use

primes = sieve [2..]
    where sieve (p:xs) = p : sieve [x | x<-xs, x `mod` p /= 0]

and the result is actually an infinite list. But Haskell will evaluate it from left to right, so as long as you don't try to do something that requires the entire list, you can still use it without the program getting stuck in infinity, such as:

foo = sum $ takeWhile (<100) primes

which sums all primes less than 100. This is nice for several reasons. First of all, I only need to write one primes function that generates all primes, and then I'm pretty much ready to work with primes. In an object-oriented programming language, I would need some way to tell the function how many primes it should compute before returning, or emulate the infinite list behavior with an object. Another thing is that, in general, you end up writing code that expresses what you want to compute rather than the order in which to evaluate things - the compiler does that for you.

This is not only useful for infinite lists; in fact it kicks in all the time, without you noticing, whenever there is no need to evaluate more than necessary.

waxwing
This isn't entirely true; with the yield return behavior of C# (an object-oriented language), you can also declare infinite lists that are evaluated on demand.
Jeff Yates
Good point. You are correct, and I should avoid stating what can and cannot be done in other languages so categorically. I think my example was flawed, but I still think you gain something from Haskell's way of lazy evaluation: it's really there by default and without any effort from the programmer. And this, I believe, is due to its functional nature and the absence of side effects.
waxwing
You may be interested to read why "sieve" is *not* the Sieve of Eratosthenes: http://lambda-the-ultimate.org/node/3127
Chris Conway
@Chris: Thanks, that was actually a quite interesting article! The above primes function is not the one I have been using for my own computations since it is painfully slow. Nevertheless, the article brings up a good point that checking all numbers for mod is really a different algorithm.
waxwing
+3  A: 

It has no loop constructs. Not many languages have this trait.

Badri
ghci> :m + Control.Monad
ghci> forM_ [1..3] print
1
2
3
jetxee
A: 

To air a contrarian view: Steve Yegge writes that Hindley-Milner languages lack the flexibility required to write good systems:

H-M is very pretty, in a totally useless formal mathematical sense. It handles a few computation constructs very nicely; the pattern matching dispatch found in Haskell, SML and OCaml is particularly handy. Unsurprisingly, it handles some other common and highly desirable constructs awkwardly at best, but they explain those scenarios away by saying that you're mistaken, you don't actually want them. You know, things like, oh, setting variables.

Haskell is worth learning, but it has its own weaknesses.

RossFabricant
While it's certainly true that strong type systems generally require you to adhere to them (that's what makes their strength useful), it's also the case that many (most?) existing HM-based type systems do, in fact, have some kind of 'escape hatch' as described in the link (take Obj.magic in O'Caml as an example, though I've never used it except as a hack); in practice, however, for many kinds of programs one never needs such a device.
Zach Snow
The question of whether setting variables is "desirable" depends on how much pain it is to use the alternative constructs vs how much pain is caused by the use of variables. That's not to dismiss the entire argument, but rather to point out that taking the statement "variables are a highly desirable construct" as an axiom is not the basis of a cogent argument. It just happens to be the way most people learn how to program.
dasil003
-1: Steve's statements are partly out of date but mostly just completely factually wrong. OCaml's relaxed value restriction and .NET's type system are some obvious counter examples to his statements.
Jon Harrop
Steve Yegge has an unreasonable bee in his bonnet about static typing, and not only is most of what he says about it wrong, he also keeps bringing it up at every available opportunity (and even some unavailable ones). You'd do well to trust only your own experience in this regard.
ShreevatsaR
Although I disagree with Yegge on static vs. dynamic typing, Haskell does have the Data.Dynamic type. If you want dynamic typing, you can have it!
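A tiny sketch of what that looks like (the values here are made up):

import Data.Dynamic

-- A heterogeneous list; each element is recovered (or not) at runtime.
stuff :: [Dynamic]
stuff = [toDyn (42 :: Int), toDyn "hello", toDyn (3.14 :: Double)]

main :: IO ()
main = do
  print (fromDynamic (head stuff) :: Maybe Int)     -- Just 42
  print (fromDynamic (head stuff) :: Maybe String)  -- Nothing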
jrockway
When and how to use type inference is, I think, something that is still being worked out by the Haskell community. I really doubt that anyone who has written Haskell programs requiring undecidable instances would say perfection has been reached with GHC's type system. HM is here to stay, but I could see Haskell moving towards something more like Scala's type inference (inference in functions, but not at the top level).
Jonathan Fischoff
@Jonathan: Moving away from ML and towards Scala and C# would be a huge step in the wrong direction...
Jon Harrop
+2  A: 

If you can wrap your head around the type system in Haskell I think that in itself is quite an accomplishment.

Dr. Watson
What is there to get? If you must, think "data" == "class" and "typeclass" == "interface"/"role"/"trait". It couldn't be simpler. (There isn't even "null" to mess you up. Null is a concept you can build into your type yourself.)
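Roughly, in code (toy types, invented for illustration):

-- "data" plays the role of a class; a typeclass plays the role of an interface.
data Shape = Circle Double | Square Double

class HasArea a where
  area :: a -> Double

instance HasArea Shape where
  area (Circle r) = pi * r * r
  area (Square s) = s * s

-- And instead of a built-in null, "possibly missing" is just another type:
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x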
jrockway
There's a whole lot to get, jrockway. While you and I find it relatively straightforward, many people -- even many developers -- find certain kinds of abstractions very difficult to understand. I know many developers who still don't quite grasp the idea of pointers and references in more mainstream languages, even though they use them every day.
Gregory Higley
+24  A: 

You are kind of asking the wrong question.

Haskell is not a language where you go look at a few cool examples and go "aha, I see now, that's what makes it good!"

It's more like, we have all these other programming languages, and they're all more or less similar, and then there's Haskell which is totally different and wacky in a way that's totally awesome once you get used to the wackiness. But the problem is, it takes quite a while to acclimate to the wackiness. Things that set Haskell apart from almost any other even-semi-mainstream language:

  • Lazy evaluation
  • No side effects (everything is pure, IO/etc happens via monads)
  • Incredibly expressive static type system

as well as some other aspects that are different from many mainstream languages (but shared by some):

  • functional
  • significant whitespace
  • type inferred

As some other posters have answered, the combination of all these features means that you think about programming in an entirely different way. And so it's hard to come up with an example (or set of examples) that adequately communicates this to Joe-mainstream-programmer. It's an experiential thing. (To make an analogy, I can show you photos of my 1970 trip to China, but after seeing the photos, you still won't know what it was like to have lived there during that time. Similarly, I can show you a Haskell 'quicksort', but you still won't know what it means to be a Haskeller.)

Brian
I disagree with your first sentence. I was really impressed with a few examples of Haskell code initially, and what really convinced me it was worth learning was this article: http://www.cs.dartmouth.edu/~doug/powser.html But of course, this is interesting for a mathematician/physicist. A programmer looking into real world stuff would find this example ridiculous.
Rafael S. Calsaverini
@Rafael: That begs the question "what would a programmer looking into real world stuff be impressed by"?
Jon Harrop
Good question! I'm not a "real world" programmer so I don't know what they like. hahaha...I know what physicists and mathematicians like. :P
Rafael S. Calsaverini
+6  A: 

Part of the fuss is that purity and static typing enable parallelism combined with aggressive optimisations. Parallel languages are hot now, with multicore being a bit disruptive.

Haskell gives you more options for parallelism than pretty much any general purpose language, along with a fast, native code compiler. There is really no competition with this kind of support for parallel styles.

So if you care about making your multicore work, Haskell has something to say. A great place to start is with Simon Peyton Jones' tutorial on parallel and concurrent programming in Haskell.
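
As one small taste (a toy example, not taken from the tutorial): sparking both branches of a recursive call with par and pseq from the parallel package, then compiling with -threaded and running with +RTS -N:

import Control.Parallel (par, pseq)

-- Deliberately naive Fibonacci, just to give the sparks something to chew on.
nfib :: Int -> Integer
nfib n | n < 2     = 1
       | otherwise = x `par` (y `pseq` x + y)
  where x = nfib (n - 1)
        y = nfib (n - 2)

main :: IO ()
main = print (nfib 35)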

Don Stewart
"along with a fast, native code compiler"?
Jon Harrop
I believe dons is referring to GHC.
Gregory Higley
+2  A: 

I agree with others that seeing a few small examples is not the best way to show off Haskell. But I'll give some anyway. Here's a lightning-fast solution to Project Euler problems 18 and 67, which ask you to find the maximum-sum path from the base to the apex of a triangle:

bottomUp :: (Ord a, Num a) => [[a]] -> a
bottomUp = head . bu
  where bu [bottom]     = bottom
        bu (row : base) = merge row $ bu base
        merge [] [_] = []
        merge (x:xs) (y1:y2:ys) = x + max y1 y2 : merge xs (y2:ys)

Here is a complete, reusable implementation of the BubbleSearch algorithm by Lesh and Mitzenmacher. I used it to pack large media files for archival storage on DVD with no waste:

data BubbleResult i o = BubbleResult { bestResult :: o
                                     , result :: o
                                     , leftoverRandoms :: [Double]
                                     }
bubbleSearch :: (Ord result) =>
                ([a] -> result) ->       -- greedy search algorithm
                Double ->                -- probability
                [a] ->                   -- list of items to be searched
                [Double] ->              -- list of random numbers
                [BubbleResult a result]  -- monotone list of results
bubbleSearch search p startOrder rs = bubble startOrder rs
    where bubble order rs = BubbleResult answer answer rs : walk tries
            where answer = search order
                  tries  = perturbations p order rs
                  walk ((order, rs) : rest) =
                      if result > answer then bubble order rs
                      else BubbleResult answer result rs : walk rest
                    where result = search order

perturbations :: Double -> [a] -> [Double] -> [([a], [Double])]
perturbations p xs rs = xr' : perturbations p xs (snd xr')
    where xr' = perturb xs rs
          perturb :: [a] -> [Double] -> ([a], [Double])
          perturb xs rs = shift_all p [] xs rs

shift_all p new' [] rs = (reverse new', rs)
shift_all p new' old rs = shift_one new' old rs (shift_all p)
  where shift_one :: [a] -> [a] -> [Double] -> ([a]->[a]->[Double]->b) -> b
        shift_one new' xs rs k = shift new' [] xs rs
          where shift new' prev' [x] rs = k (x:new') (reverse prev') rs
                shift new' prev' (x:xs) (r:rs) 
                    | r <= p    = k (x:new') (prev' `revApp` xs) rs
                    | otherwise = shift new' (x:prev') xs rs
                revApp xs ys = foldl (flip (:)) ys xs

I'm sure this code looks like random gibberish. But if you read Mitzenmacher's blog entry and understand the algorithm, you'll be amazed that it's possible to package the algorithm into code without saying anything about what you're searching for.

Having given you some examples as you asked for, I will say that the best way to start to appreciate Haskell is to read the paper that gave me the ideas I needed to write the DVD packer: Why Functional Programming Matters by John Hughes. The paper actually predates Haskell, but it brilliantly explains some of the ideas that make people like Haskell.

Norman Ramsey
Whoops, it sure is. Added. If you'll delete your comment I'll delete this one...
Norman Ramsey
+4  A: 

I've spent the last year learning Haskell and writing a reasonably large and complex project in it. (The project is an automated options trading system, and everything from the trading algorithms to the parsing and handling of low-level, high-speed market data feeds is done in Haskell.) It's considerably more concise and easier to understand (for those with appropriate background) than a Java version would be, as well as extremely robust.

Possibly the biggest win for me has been the ability to modularize control flow through things such as monoids, monads, and so on. A very simple example would be the Ordering monoid; in an expression such as

c1 `mappend` c2 `mappend` c3

where c1 and so on return LT, EQ or GT: if c1 returns EQ, the expression continues by evaluating c2; if c2 returns LT or GT, that's the value of the whole expression and c3 is never evaluated. This sort of thing gets considerably more sophisticated in things like monadic message generators and parsers, where I may be carrying around different types of state, have varying abort conditions, or want to be able to decide for any particular call whether "abort" really means "no further processing" or "return an error at the end, but carry on processing to collect further error messages."
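
A concrete (and much simpler) use of the same idea, with invented record fields: sort by one key and fall back to another only on ties.

import Data.List (sortBy)
import Data.Monoid (mappend)
import Data.Ord (comparing)

data Person = Person { name :: String, age :: Int } deriving Show

-- Each comparison yields LT, EQ or GT; mappend only moves on to the next
-- comparison when the previous one came back EQ.
byAgeThenName :: Person -> Person -> Ordering
byAgeThenName a b = comparing age a b `mappend` comparing name a b

main :: IO ()
main = mapM_ print $
  sortBy byAgeThenName [Person "Bea" 30, Person "Al" 30, Person "Cy" 25]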

This is all stuff it takes some time and probably quite some effort to learn, and thus it can be hard to make a convincing argument for it for those who don't already know these techniques. I think that the All About Monads tutorial gives a pretty impressive demonstration of one facet of this, but I wouldn't expect that anybody not familiar with the material already would "get it" on the first, or even the third, careful reading.

Anyway, there's lots of other good stuff in Haskell as well, but this is a major one that I don't see mentioned so often, probably because it's rather complex.

Curt Sampson
Very interesting! How many lines of Haskell code went into your automated trading system in total? How did you handle fault tolerance and what kinds of performance results did you get? I have been thinking recently that Haskell has the potential to be good for low latency programming...
Jon Harrop
A: 

I agree with those that said that functional programming twists your brain into seeing programming from a different angle. I've only used it as a hobbyist, but I think it fundamentally changed the way I approach a problem. I don't think I would have been nearly as effective with LINQ without having been exposed to Haskell (and using generators and list comprehensions in Python).

Jacob
A: 

I tried Haskell and I loved it. To tell you the truth, I found it difficult to understand functional programming, and since Haskell doesn't have any loops, it uses recursion extensively. But even though I was racking my brain trying to solve the problems given in the book, I found it highly satisfying. I didn't give you an example, but the answer is there: try it yourself.

Armandas
+10  A: 

This is the example that convinced me to learn Haskell (and boy am I glad I did).

-- program to copy a file --
import System.Environment (getArgs)

main = do
         --read command-line arguments
         [file1, file2] <- getArgs

         --copy file contents
         str <- readFile file1
         writeFile file2 str

OK, it's a short, readable program. In that sense it's better than a C program. But how is this so different from (say) a Python program with a very similar structure?

The answer is lazy evaluation. In most languages (even some functional ones), a program structured like the one above would result in the entire file being loaded into memory, and then written out again under a new name.

Haskell is "lazy". It doesn't calculate things until it needs to, and by extension doesn't calculate things it never needs. For instance, if you were to remove the writeFile line, Haskell wouldn't bother reading anything from the file in the first place.

As it is, Haskell realises that the writeFile depends on the readFile, and so is able to optimise this data path.

While the results are compiler-dependent, what will typically happen when you run the above program is this: the program reads a block (say 8KB) of the first file, then writes it to the second file, then reads another block from the first file, and writes it to the second file, and so on. (Try running strace on it!)

... which looks a lot like what the efficient C implementation of a file copy would do.

So, Haskell lets you write compact, readable programs - often without sacrificing a lot of performance.

Another thing I must add is that Haskell simply makes it difficult to write buggy programs. The amazing type system, lack of side-effects, and of course the compactness of Haskell code reduce bugs for at least three reasons:

  1. Better program design. Reduced complexity leads to fewer logic errors.

  2. Compact code. Fewer lines for bugs to exist on.

  3. Compile errors. Lots of bugs just aren't valid Haskell.

Haskell isn't for everyone. But everyone should give it a try.

Artelius
A: 

For me, the attraction of Haskell is the promise of compiler-guaranteed correctness, even if only for the pure parts of the code.

I have written a lot of scientific simulation code, and have wondered so many times whether there was a bug in my prior code that could invalidate a lot of current work.
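
To make that concrete with a toy example (the unit types here are invented): newtypes let the compiler reject the kind of unit mix-up that silently corrupts a simulation.

newtype Metres  = Metres  Double deriving Show
newtype Seconds = Seconds Double deriving Show

speed :: Metres -> Seconds -> Double
speed (Metres d) (Seconds t) = d / t

main :: IO ()
main = print (speed (Metres 10) (Seconds 2))
-- speed (Seconds 10) (Metres 2) would be rejected at compile time.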

rpg
How does it guarantee correctness?
Jonathan Fischoff
The pure parts of the code are much safer than the impure ones. The ratio of trust to effort invested is way higher.
rpg
@rpg: What gave you that impression?
Jon Harrop
A: 

I find that for certain tasks I am incredibly productive with Haskell.

The reason is the succinct syntax and the ease of testing.

This is what the function declaration syntax is like:

foo a = a + 5

That's the simplest way I can think of to define a function.

If I write the inverse

inverseFoo a = a - 5

I can check that it is an inverse for any random input by writing

prop_IsInverse :: Double -> Bool
prop_IsInverse a = a == (inverseFoo $ foo a)

And calling from the command line

jonny@ubuntu: runhaskell quickCheck +names fooFileName.hs

This will check that all the properties in my file hold, by randomly testing inputs a hundred times or so.

I don't think Haskell is the perfect language for everything, but when it comes to writing little functions and testing, I haven't seen anything better. If your programming has a mathematical component this is very important.

Jonathan Fischoff
What problems are you solving and which other languages have you tried?
Jon Harrop
Real-time 3D graphics for mobile and the iPad.
Jonathan Fischoff