Let's pretend I have a program that needs an environment set. Let's just pretend it's Perl and I want to modify the environment (to search for libraries in a special spot). Every time I mess with the standard way to do things in UNIX I pay a heavy price and a penalty in flexibility. I know that by using a simple shell script I will inject an additional process into the process tree. Any process accessing its own process tree might be thrown for a little bit of a loop. Anything recursive in a nontrivial way would need to defend against multiple expansions of the environment. Anything resembling being in a pipe of programs (or closing and opening STDIN, STDOUT, or STDERR) is my biggest area of concern. What am I doing to myself?
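
(For the multiple-expansions worry, I imagine the wrapper could mark the environment the first time and skip the setup on re-entry; a rough sketch, with MYPROG_ENV_SET as a made-up guard variable:)

#!/bin/bash -
# Only adjust the environment once, even if the wrapper gets re-entered.
if [ -z "${MYPROG_ENV_SET}" ]; then
    export MYPROG_ENV_SET=1
    export PATH=/my/special/path:${PATH}
fi
perl myprog.pl "$@"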

+3  A: 

What am I doing to myself?

Getting yourself all het up over nothing?

Wrapping a program in a shell script in order to set up the environment is actually quite standard and the risk is pretty minimal unless you're trying to do something really weird.
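
As for the pipe worry specifically: a non-interactive shell script reads its commands from the script file, not from STDIN, and the file descriptors it inherits are passed straight through to the program it runs. So a wrapper sits in a pipeline just like the bare program would (mywrapper.sh and data.txt are hypothetical names here):

$ cat data.txt | ./mywrapper.sh | sort
$ cat data.txt | perl myprog.pl | sort      # same behavior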

If you're really concerned about having one more process around (and UNIX processes are very cheap, by design), then use the exec builtin, which, instead of forking a new process, replaces the current shell process with the new executable. So, where you might have had

#!/bin/bash -
# export so the settings actually reach myprog.pl's environment
export FOO=hello
export PATH=/my/special/path:${PATH}
perl myprog.pl

You'd just say

#!/bin/bash -
export FOO=hello
export PATH=/my/special/path:${PATH}
exec perl myprog.pl

and the spare process goes away.
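
A quick way to convince yourself (a small sketch; the Perl one-liner just prints its own PID) is to compare the wrapper's PID with the one perl reports. With exec in place, both lines print the same number, because nothing was forked:

#!/bin/bash -
# $$ in bash is this shell's PID; $$ inside the single-quoted Perl code is perl's PID.
echo "wrapper pid: $$"
exec perl -e 'print "perl pid:    $$\n"'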

This trick, however, is almost never worth the bother; the one counter-example is that if you can't change your default shell, it's useful to say

$ exec zsh

in place of just running the shell, because then you get the expected behavior for process control and so forth.

Charlie Martin
I think I will accept this. I worry a lot.
ojblass