Two questions: how can I write a shell variable from this script into its child script?

Are there any easier ways to do this?

If you can't follow what I'm doing, I'm:

1) starting with a list of directories whose names will be stored as values taken by $i

2) cd'ing to every value of $i and ls'ing its contents

3) echoing its contents via cat into a new script named after the directory

4) using echo and cat to write a new script that contains the ls'd contents of $i and sends them all to a blogging email address called [email protected]

#!/bin/sh
read -d '' commands <<EOF

#list of directories goes here
dir1
dir2
dir3
etc...    

EOF

for i in $commands
do

cd $SPECIALPATH/$i
echo ("#/bin/sh \n read -d '' directives <<EOF \n") | cat >> $i.sh
ls | cat >> $i.sh
echo ("EOF \n for q in $directives \n do \n uuencode $q $q | sendmail $i \n done \n") | cat >> $i.sh
# NB -- I am asking the script to write the shell variable $i into the new
# script, called $i.sh, as the email address specified, in the middle of an
# echo statement... I am well aware that it doesn't work as is
chmod +x $i.sh
./$i.sh    

done
+1  A: 

If the generated script is meant to be temporary, I would not use files. Besides, chmodding them executable sounds unsafe. When I needed to parallelize my scripting, I used a bash script to build up a set of commands (collected in an array, split the array in two, then imploded each half) into a single \n-separated string, and then passed that string to a new bash instance.

Basically, in bash:

for orig in "$@"
do
    commands="$commands echo \"echoing stuff here for arguments $orig\" \n"
done

echo -e $commands | bash

And a small tip: if the script doesn't need supervising, throw in a & after the piped bash to make your first script quit and do the rest of the work forked into the background.
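A rough sketch of the array-splitting idea combined with that tip (the command list here is made up for illustration):

cmds=("echo one" "echo two" "echo three" "echo four")
half=$(( ${#cmds[@]} / 2 ))
printf '%s\n' "${cmds[@]:0:half}" | bash &   # first half forked into the background
printf '%s\n' "${cmds[@]:half}" | bash       # second half runs in this shell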

progo
+1  A: 

If you export a variable

export VAR1=FOO

it'll be present in any child processes.

If you take a look at the init scripts, /etc/init.d/*, you'll notice that many source another file full of "external" definitions. You could set up a file like that and have your child scripts source it.
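A minimal sketch of that pattern (the file names here are hypothetical):

# definitions.sh -- shared "external" definitions, meant to be sourced
VAR1=FOO
export VAR1

# child.sh -- picks the definitions up at startup
. /path/to/definitions.sh
echo "VAR1 is $VAR1"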

Paul Rubel
+5  A: 

You are abusing felines a lot - you should simply redirect with >> rather than piping to cat, which then appends.

You can avoid the intermediate $i.sh file entirely by bundling all the output that would go into the file into a single I/O redirection that pipes directly into a shell - no need for the intermediate file to clean up (you didn't show that happening) or for the chmod operation.

I would have done this using braces:

{
echo "..."
ls
echo "..."
} | sh

However, when I looked at the script in that form, I realized that wasn't necessary. I've left the initial part of your script unchanged, but the loop is vastly simpler like this:

#!/bin/sh
read -d '' commands <<EOF

#list of directories goes here
dir1
dir2
dir3
etc...    

EOF

for i in $commands
do
    (
    cd $SPECIALPATH/$i
    ls |
    while read q
    do uuencode $q $q | sendmail $i
    done
    )
done

I'm assuming the sendmail command works - it isn't the way I'd try sending email. I'd probably use mailx or something similar, and I'd avoid using uuencode too (I'd use a base-64 encoding, left to my own devices):

    do uuencode $q $q | mailx -s "File $q" [email protected]

The script also uses parentheses around the cd command. They mean that the cd command and what follows are run in a sub-shell, so the parent script does not change directory. In this case, with an absolute pathname in $SPECIALPATH, it would not matter much. But as a general rule, it often makes life easier if you isolate directory changes like that.
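For example, a quick illustration of that isolation:

pwd                  # e.g. /home/user
( cd /tmp && pwd )   # prints /tmp; the cd happens in a sub-shell
pwd                  # still /home/user - the parent shell never moved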

I'd probably simplify it still further for general reuse (though I'd need to add something to ensure that SPECIALPATH is set appropriately - a one-line guard for that is sketched below):

#!/bin/sh

for i in "$@"
do
    (
    cd $SPECIALPATH/$i
    ls |
    while read q
    do uuencode $q $q | sendmail $i
    done
    )
done

I can then invoke it with:

script-name $(<list-of-dirs)

That means that without editing the script, it can be reused for any list of directories.
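For the SPECIALPATH check mentioned earlier, a one-line guard near the top of the script would do (a sketch using the standard ${var:?} expansion):

: ${SPECIALPATH:?"SPECIALPATH must be set"}   # abort with an error if it is unset or empty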


Intermediate step 1 (generating the child script on the fly and piping it straight into sh, instead of writing and running $i.sh):

for i in $commands
do
    (
    cd $SPECIALPATH/$i
    {
    echo "read -d '' directives <<EOF"
    ls
    echo "EOF"
    echo 'for q in $directives'
    echo "do"
    echo "    uuencode \$q \$q | sendmail $i"
    echo "done"
    } |
    sh
    )
done

Personally, I find it easier to read the generated script if the code that generates it makes the generated script's layout clear - using multiple echo commands, one per line of output. This includes indenting the code.

Intermediate step 2 (replacing the heredoc variable with a while read loop):

for i in $commands
do
    (
    cd $SPECIALPATH/$i
    {
    echo "ls |"
    echo "while read q"
    echo "do"
    echo "    uuencode \$q \$q | sendmail $i"
    echo "done"
    } |
    sh
    )
done

I don't need to read the data into a variable in order to step through each item in the list once - simply read each line in turn. The while read mechanism is often useful for splitting up a line into multiple variables too: while read var1 var2 var3 junk will read the first field into $var1, the second into $var2, the third into $var3, and if there's anything left over, it goes into $junk. If you've generated the data accurately, there won't be any junk; but sometimes you have to deal with other people's data.
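For instance, a quick sketch of that field splitting:

echo "alpha beta gamma delta epsilon" |
while read var1 var2 var3 junk
do
    echo "var1=$var1 var2=$var2 var3=$var3 junk=$junk"
done
# prints: var1=alpha var2=beta var3=gamma junk=delta epsilon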

Jonathan Leffler
Thank you. That's an excellent answer.
Ixfoxleigh