views: 239

answers: 5

As someone who did a lot of sh scripting twenty years ago and is now coming back to it, I find I'm using techniques that are considered obsolete. I should take the time to read a "What's new" guide, but I don't, and picking changes up piecemeal isn't terribly efficient. Examples:

 Instead of             use 

 tmpfile=/tmp/me$$      tmpfile=`mktemp`

 [ ]                     [[ ]]
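For context, a hedged sketch (my example, not from the question) of the mktemp idiom, with a trap for cleanup:

```shell
# mktemp avoids the predictable-name race of /tmp/me$$;
# the EXIT trap removes the file when the script ends.
tmpfile=$(mktemp) || exit 1
trap 'rm -f "$tmpfile"' EXIT
printf 'scratch data\n' > "$tmpfile"
```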

What changes do you think are important, comparing the original Bourne shell to Bash?

+2  A: 

This might not be a direct answer to your question, but I think some of these "pitfalls" come from sh habits:

http://wooledge.org:8000/BashPitfalls

+3  A: 

For scripting, unless there's a specific reason to do otherwise, I limit myself to the Bourne constructs. They are maximally portable, and should run on systems that use bash, ksh, or even sh as their default shell.

I find any actual performance differences to be minimal (stopwatch timing); if performance is both important and limited by the shell, I'll move the time-critical part to a compiled language. The extra capabilities of more modern shells are great, and I use them interactively or maybe for ad hoc quick scripts. If I'm going to distribute and maintain the code, however, I've found that ignoring the extensions saves me time and effort.

If you are comfortable with the Bourne shell syntax, and can make the script do what you want using Bourne shell, then don't bother with the extensions.
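As a hedged illustration (my example, not from the answer): case pattern matching is a Bourne construct that covers much of what the bash-only [[ $name == foo* ]] test does.

```shell
# Portable Bourne-style prefix match; runs under sh, ksh, or bash.
name="foobar"
case "$name" in
    foo*) match="starts with foo" ;;
    *)    match="no match" ;;
esac
echo "$match"   # prints: starts with foo
```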

mpez0
+2  A: 

Instead of

tmpfile=`mktemp`

Use

tmpfile=$(mktemp)

This is the generally recommended practice (it nests better).
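A hedged illustration (my example) of why $() nests better:

```shell
# The inner $() needs no extra escaping; the backtick equivalent
# would require \` escapes around the inner command.
outer=$(basename "$(dirname /usr/local/bin)")
echo "$outer"   # prints: local
```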

Something I also noticed with bash 4.0's release is this situation in a few scripts:

foo="$( some   
          multiline 
          command  in a string)"

The problem I've been finding since 4.0 is that the newlines need to be explicitly escaped in that case too, so

foo="$( some \
          multiline \
          command  in a string)"

is recommended.

Kent Fredric
+3  A: 

Just a taste:

  • Forget [, learn [[:

    • No wordsplitting or pathname expansion happens on unquoted variables in [[.
    • You can use = to do glob pattern matching: [[ $foo = *.txt ]] (foo ends with .txt)
    • You can use =~ to do ERE (regex) matching: re='\.txt$'; [[ $foo =~ $re ]] (foo ends with .txt)
    • You can use &&, || and ( ) inside the test: [[ $bar && ( $foo = *.txt || $foo = *.bar ) ]]
    • Gotcha: the RHS of = is considered a glob pattern: bar='I pinch??'; [[ "I pinched" = $bar ]] # test passes.
  • Use (( )) for everything numeric.

Eg.

(( ++count ))
(( $# )) || { echo "Expected an argument to the script." >&2; exit 1; }
  • Some lovely IO operators, such as <(), <<<, etc.

Eg.

read filesize _ < <(wc -c myfile)
openssl base64 <<< "Bar!" # as opposed to the more expensive: echo "Bar!" | openssl base64
content=$(<file) # as opposed to the more expensive: content=$(cat file)
  • Forget deprecated syntax, such as `` . The new syntax often has important advantages: $() nests easily (just try to nest `` sanely), quoting inside $() works normally (again, it's a mess inside `` ), etc.

Eg.

rm "$(grep -l foo <<< "$(</my/file.list)")"
  • Arrays, arrays, arrays. Whenever you need to keep multiple strings (like filenames!), keep them in an array. Do not keep them in a single string that uses some kind of delimiter to separate them; that method is always flawed.

Eg.

files=(/foo/*); for file in "${files[@]}"; do pinch "$file"; done

For more, check out Greg's wiki (wooledge.org, linked above). It is probably the single most useful and trustworthy Bash resource around:

lhunath
A: 

Adding to what's already here:

String manipulation in modern shells is better. With older sh you have the ${var##pattern} and ${var%%pattern} constructs, but with bash and ksh93 (and other less-common shells) you get operators that do string replacement (like ${var/source/replace}).
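Hedged examples (mine, not from the answer) of these operators in action:

```shell
path="/home/user/file.txt"
echo "${path##*/}"          # longest-prefix strip: file.txt
echo "${path%%.*}"          # longest-suffix strip: /home/user/file
echo "${path/user/admin}"   # replacement (bash/ksh93): /home/admin/file.txt
```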

The extended glob (extglob) pattern support is also fairly handy; though not exactly new, it's usually underutilized by old-school shell programmers. For example:

[[ "$str" = @(+([a-z])?([0-9])) ]]

to match strings made of one or more lowercase letters, optionally ending in a single digit. I use that all the time in ksh88...
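A hedged, runnable version of that pattern (mine; bash needs extglob enabled, while ksh has these patterns by default):

```shell
shopt -s extglob          # bash: enable extended glob patterns
str="hello5"
if [[ $str = @(+([a-z])?([0-9])) ]]; then
    echo "match"          # prints: match
fi
```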

Then there's variable indirection (var=hello; hello=hi; echo "${!var}" in bash) and arrays that are large enough to actually be useful. Arrays are another of those things old-school shell developers often underutilize, partly because of the 1024-element limit that existed in older shells - modern shells support at least 4096 elements, and most of them support associative arrays (Perl calls them hashes).
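A hedged sketch (my example, bash syntax) of indirection and associative arrays:

```shell
# Indirect expansion: ${!var} expands to the value of the variable
# whose name is stored in var.
var=hello; hello=hi
echo "${!var}"            # prints: hi

# Associative arrays (bash 4+):
declare -A color
color[apple]=red
echo "${color[apple]}"    # prints: red
```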

dannysauer