Hello,

I've inherited someone else's monster of a Bash script. The script was written in such a way that it uses a ridiculous amount of memory (around 1GB). I can run it from a shell without issue, but if I run it from cron it crashes with a segfault.

Apart from digging into the poorly commented behemoth, is there a way to run it from cron without running into the segfault?

Cheers,

Steve

+1  A: 

When you run something from cron, you'll encounter issues with the environment variables being different, or simply not set, compared to what you get when you execute it manually. Often things like PATH aren't set properly when cron executes something, so it's important to supply full paths to executables within the script, even for things such as perl or common commands that you think should be found in the default PATH. Without more info it's hard to speculate on what precisely the problem is.
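
A minimal sketch of both ideas (the schedule and script path are hypothetical): set PATH explicitly at the top of the crontab, and dump cron's environment to a file so you can compare it against your login shell's:

PATH=/usr/local/bin:/usr/bin:/bin
# Hypothetical entry: capture cron's environment, then run the script
0 2 * * * env > /tmp/cron_env.txt; /path/to/bigscript.sh

From an interactive shell, run env > /tmp/shell_env.txt and diff the two files to see exactly what cron is missing.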

idontwanttortfm
Also look into LD_LIBRARY_PATH in the cron environment.
dmckee
+1  A: 

Is it expecting to be connected to a tty, or to have an open stdin? Try redirecting a file of something random to it as input when it's running from cron.
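
For example (paths and schedule are hypothetical), you can hand the script an explicit stdin in the crontab entry:

# Hypothetical: give the script a real stdin instead of a closed one
0 2 * * * /path/to/bigscript.sh < /path/to/sample_input.txt

If it only needs stdin to exist rather than to contain anything, < /dev/null works as well.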

What segfaults? Bash, or something it calls?

Any hints from the core file as to what the problem is?
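
If you can get a core file, a quick way to see where it died is to load it into gdb against the bash binary (the core path here is hypothetical, and you may need to raise the core size limit near the top of the script so the cron-spawned shell is allowed to dump core):

ulimit -c unlimited    # inside the script: let this shell and its children dump core
gdb /bin/bash /path/to/core
(gdb) bt               # backtrace showing where the crash happened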

Paul
A: 

Try making sure stdout and stderr have somewhere to go:

/path/to/bigscript.sh &> /dev/null

[Edit] You may want to use a file other than /dev/null, especially if you're running it in debug mode ;)

The script being so huge, I'm not sure if running in debug mode would help, but you can try. In bash, it's the '-x' option, which you can just put in the shebang.
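
A minimal sketch of that (paths are hypothetical): put -x in the shebang so bash traces every command, and point the output at a log you can read:

#!/bin/bash -x
# ... rest of bigscript.sh unchanged ...

and in the crontab:

0 2 * * * /path/to/bigscript.sh &> /tmp/bigscript.log

The trace goes to stderr, so this captures it along with normal output. Note that cron runs entries with /bin/sh by default, where the portable spelling is > /tmp/bigscript.log 2>&1.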

And as said in the other answers, there's a good chance it's an environment-variable issue.

JimB