The reporting tools will generate a huge number of reports/files in the file system (a Unix directory). There's a list of destinations (email addresses and shared folders), and a different set of reports/files (possibly overlapping) needs to be distributed to each destination.

I'd like to know if there's a way to manage this report delivery efficiently using shell scripts, so that maintaining the list of reports and destinations doesn't become a mess in the future.

It's quite an open-ended question; the constraint, however, is that it should work within the boundaries of managing the reports in a Unix FS.

A: 

You could always create a simple text file (report_locations.txt here) with the names of the reports and the locations they go to, e.g.

ReportName1;/home/bob
ReportName2;/home/jim,/home/jill
ReportName3;/home/jill,/home/bob

The report name is always the first field in this example, followed by the locations where the corresponding report should go, delimited by commas (or any other delimiter you like).
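With the fields delimited like that, looking up where a single report goes is a one-liner. A small self-contained sketch (it recreates the example list first; the `^` and `;` anchor the match to the name field only, so similar names can't collide):

```shell
# Recreate the example list from above
printf '%s\n' 'ReportName1;/home/bob' \
              'ReportName2;/home/jim,/home/jill' \
              'ReportName3;/home/jill,/home/bob' > report_locations.txt

# Where does ReportName2 go? Anchor on "^name;" to match the name field only.
grep '^ReportName2;' report_locations.txt | cut -d';' -f2 | tr ',' '\n'
# -> /home/jim
#    /home/jill
```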

Then read that file with a shell script (I like to use for loops for this sort of operation):

#!/usr/bin/ksh93
for REPORT in $(cut -d";" -f1 report_locations.txt)
do
        # Anchor the match on "^name;" so e.g. ReportName1 does not also match ReportName10
        LISTS=$(grep "^${REPORT};" report_locations.txt | cut -d";" -f2)
        for LIST in ${LISTS}
        do
                DIRS=$(echo "${LIST}" | tr ',' '\n')
                for DIR in ${DIRS}
                do
                        echo "Copying ${REPORT} to ${DIR}"
                        cp -f "${REPORT}" "${DIR}"
                done
        done
done

The use of for loops may be a bit excessive (I get caught up in them), but it gets the job done.
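If the nested loops ever get hard to follow, a single `while read` pass over the file does the same job without re-scanning it with grep for each report, and it's a natural place to branch on destination type (the question mentions email addresses too). This is only a sketch: the demo input, the `*@*` test, and the `mailx` call are my assumptions, and the actual `cp`/`mailx` commands are commented out so it runs as a dry run that logs what it would do:

```shell
#!/bin/sh
# Demo input in the same NAME;dest1,dest2 format as above (names are made up)
printf '%s\n' 'ReportName1;/home/bob' \
              'ReportName2;jim@example.com,/home/jill' > report_locations.txt

# One pass over the file: the name and the destination list are split on ";"
while IFS=';' read -r REPORT DESTS
do
        for DEST in $(echo "${DESTS}" | tr ',' ' ')
        do
                case "${DEST}" in
                *@*)    # Looks like an e-mail address: mail the report
                        echo "Mailing ${REPORT} to ${DEST}" | tee -a dispatch.log
                        # mailx -s "Report: ${REPORT}" "${DEST}" < "${REPORT}"
                        ;;
                *)      # Otherwise treat it as a directory
                        echo "Copying ${REPORT} to ${DEST}" | tee -a dispatch.log
                        # cp -f "${REPORT}" "${DEST}"
                        ;;
                esac
        done
done < report_locations.txt
```

The dispatch.log file doubles as an audit trail of what was sent where, which helps when the list grows.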

Not sure this is exactly what you're looking for, but it's a starting point if anything. Don't hesitate to ask if you need any explanation of the code.

Sean
thanks. will try it out.
mossie