I have just started learning Erlang and am trying out some Project Euler problems to get started. However, I can't seem to do any operations on large sequences without crashing the Erlang shell.

I.e., even this:

lists:seq(1,64000000).

crashes erlang, with the error:

eheap_alloc: Cannot allocate 467078560 bytes of memory (of type "heap").

The actual number of bytes varies, of course.

Now, half a gig is a lot of memory, but a system with 4 GB of RAM and plenty of room for virtual memory should be able to handle it.

Is there a way to let Erlang use more memory?

+1  A: 

Possibly a noob answer (I'm a Java dev), but the JVM artificially limits the amount of memory it will use, to make memory leaks easier to detect. Perhaps Erlang has similar restrictions in place?

Draemon
+2  A: 

Also, both Windows and Linux place limits on the maximum amount of memory a process image can occupy. As I recall, on Linux it is half a gigabyte.

The real question is why these operations aren't being done lazily ;)

Marcin
+11  A: 

Your OS may have a default limit on the size of a user process. On Linux you can change this with ulimit.

You probably want to iterate over these 64000000 numbers without needing them all in memory at once. Lazy lists let you write code similar in style to the list-all-at-once code:

-module(lazy).
-export([seq/2]).

%% Return a thunk that, when forced, yields either [] or a cell
%% [Head | Tail], where Tail is the thunk for the rest of the sequence.
seq(M, N) when M =< N ->
    fun() -> [M | seq(M+1, N)] end;
seq(_, _) ->
    fun() -> [] end.

1> Ns = lazy:seq(1, 64000000).
#Fun<lazy.0.26378159>
2> hd(Ns()).
1
3> Ns2 = tl(Ns()).
#Fun<lazy.0.26378159>
4> hd(Ns2()).
2
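
To actually process such a sequence without ever materializing it, you can fold over the thunks one cell at a time. Here is a minimal sketch (the module name lazy_fold and both functions are illustrative additions, not part of the answer above), assuming the lazy:seq/2 definition given earlier:

-module(lazy_fold).
-export([foldl/3, sum/1]).

%% Fold over a lazy list as produced by lazy:seq/2 above. Each call
%% forces exactly one cell; the tail T is itself a thunk, so memory
%% use stays constant regardless of the length of the sequence.
foldl(F, Acc, Lazy) ->
    case Lazy() of
        []      -> Acc;
        [H | T] -> foldl(F, F(H, Acc), T)
    end.

%% Sum a lazy sequence without building the whole list in memory.
sum(Lazy) ->
    foldl(fun(X, Acc) -> X + Acc end, 0, Lazy).

With this, lazy_fold:sum(lazy:seq(1, 64000000)) should run in constant space, since each forced cell becomes garbage as soon as the fold moves past it.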
Darius Bacon
+2  A: 

This is a feature. We do not want one process to consume all the memory. It's like the fuse box in your house: it's there for the safety of us all.

You have to know Erlang's recovery model to understand why they let the process just die.

Flinkman