tags:
views: 58
answers: 2

I've been running some memory-intensive processes on EC2 servers. The code runs quite well for about 12-14 hours (it's running 1000s of simulations on 12-14 large datasets) and then all of a sudden I just see the message "Killed" with no further explanation.

What makes R do that?

UPDATE: My server specs.

A: 

As far as I know, R doesn't produce a "Killed" message itself. Most likely your operating system is imposing a process limit or some kind of quota. If you are working on a networked system, it may be worth asking your sysadmin.
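If it is the OS, a quick way to check is to look at the per-process limits, the memory the system actually sees, and the kernel log (a sketch, assuming a Linux host with shell access; the exact log wording varies by kernel version):

```shell
# Show the current per-process resource limits (max memory, CPU time, etc.)
ulimit -a

# Show how much RAM and swap the system sees, in megabytes
free -m

# Look for out-of-memory kills in the kernel log
# (may require root; the message wording varies across kernels)
dmesg | grep -i "out of memory" || true
```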

Xzhsh
The entire server is mine, and I have no limits set that I know of. It's a server with 67 gigs of RAM, and I batch the scripts so that no more than one is running at a time. So I am a little puzzled why it quits without further explanation. I source the script from within R.
Maiasaura
Hmm... https://stat.ethz.ch/pipermail/r-help/2004-April/049212.html. AFAIK R doesn't have "Killed" as an error, though... are you sure there aren't any default process limits? I'd check your settings. Also, make sure you aren't using the 32-bit version of R, and check how much memory your Ubuntu installation sees with the free command.
Xzhsh
Thanks Xzhsh. That was helpful.
Maiasaura
No problem, I hope you can resolve your issue some time soon.
Xzhsh
+2  A: 

It could be due to the out-of-memory (OOM) killer of the operating system.

Are you cleaning up your workspace when you have finished with a dataset?
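If not, it may help to remove each large dataset and force a garbage collection before moving on to the next one (a sketch only; dataset_files, big_data, and run_simulations are hypothetical stand-ins for your own objects and functions):

```r
for (f in dataset_files) {        # hypothetical vector of dataset file paths
  big_data <- readRDS(f)          # load one large dataset at a time
  run_simulations(big_data)       # hypothetical: your batch of simulations
  rm(big_data)                    # drop the reference to the dataset
  gc()                            # ask R to return freed memory sooner
}
```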

James