Hello!
I want to update a large number of SVN-versioned projects at once, using a script. Running the update jobs one by one takes very long.
So I tried to run the jobs in parallel. It seems to work, but I'm not sure whether it's done correctly. Perhaps there are concurrency issues I didn't think of?
Please take a look at the script:
#!/bin/sh
time (
    for f in */        # glob instead of parsing `ls` output, which breaks on special characters
    do
        (
            # Capture the output first, then print it in a single call,
            # so the blocks from parallel jobs don't interleave.
            OUTPUT=$(svn update "$f")
            printf '= = = = = = = = = = %s\n%s\n' "$f" "$OUTPUT"
        ) &
    done
    wait
)
When I don't store the output first, it all comes out mixed up.
Do you think it's OK this way?
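To make that point reproducible, here is a minimal sketch of the same capture-then-print pattern, with a hypothetical `dummy_job` function standing in for `svn update` (the function name and the sample project names are assumptions, not part of my real setup):

```shell
#!/bin/sh
# Stand-in for `svn update`: emits a multi-line block for one "project".
dummy_job() {
    printf 'line1 from %s\nline2 from %s\n' "$1" "$1"
}

OUT=$(
    for name in alpha beta; do
        (
            # Capture the job's whole output first...
            RESULT=$(dummy_job "$name")
            # ...then emit header plus body in one printf call, so the
            # block stays contiguous even with other jobs running.
            printf '= = = %s\n%s\n' "$name" "$RESULT"
        ) &
    done
    wait
)
printf '%s\n' "$OUT"
```

The order of the blocks still depends on which job finishes first, but each header stays together with its own output lines, which is what mattered to me.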
NOTE: The speed-up was really about a factor of 20 for 40 projects, when there is not a lot to update.