views: 130
answers: 7

I want a bash command that I can pipe into that will sum a column of numbers. I just want a quick one-liner that will do essentially this:

cat FileWithColumnOfNumbers.txt | sum

Anybody have one in their back pocket?

+2  A: 

Does two lines count?

awk '{ sum += $1; }
     END { print sum; }' "$@"

Save it as an executable script (say, sum) and you can then use it without the superfluous 'cat':

sum < FileWithColumnOfNumbers.txt
sum   FileWithColumnOfNumbers.txt
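
A minimal sketch of such a wrapper script (the file name sum and its location are assumptions here, not part of the original answer):

#!/bin/sh
# sum: add up the first column of the named files, or of standard input
awk '{ sum += $1; }
     END { print sum; }' "$@"

Make it executable with chmod +x sum and put it somewhere on your PATH.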

FWIW: on MacOS X, you can do it with a one-liner:

awk '{ sum += $1; } END { print sum; }' "$@"
Jonathan Leffler
Ok, that qualifies. I wound up doing basically the same thing, only piping it through Perl. The final result (it parses a column out of a timesheet CSV and totals it):

$ perl -e '$sum = 0; while(<>) {s/^.*,(.*)$/\1/gi; $sum += $_ unless /Cost/;} print $sum;' report_for_XXX_XXX_2010Jun01_to_2010Jun30.csv

Quick and dirty.
jskaggz
@jskaggz - see my answer for a bit shorter/simpler Perl version :)
DVK
+1  A: 

You can use bc (the command-line calculator). Assuming your file of numbers is called "n":

$ cat n
1
2
3
$ (cat n | tr "\012" "+" ; echo "0") | bc 
6

The tr changes all newlines to "+"; then we append a 0 after the last plus and pipe the resulting expression (1+2+3+0) to bc.
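
For illustration, with the same three-line file n from above, the intermediate expression handed to bc looks like this:

$ (cat n | tr "\012" "+" ; echo "0")
1+2+3+0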

Or, if you are OK with using awk or perl, here's a Perl one-liner:

$ perl -nle '$sum += $_ } END { print $sum' n
6
DVK
+1  A: 

Use a for loop to iterate over your file …

sum=0; for x in `cat <your-file>`; do let sum+=x; done; echo $sum
t6d
+12  A: 
paste -sd+ infile|bc
radoulov
Ok, that's friggin beautiful. Thanks!
jskaggz
Oooh! I like that! +1
Dennis Williamson
That is awesome! +1 from me as well!
Buggabill
you have my vote for conciseness.
ghostdog74
There should be a badge for this.
Amardeep
+1 from me for the most concise solution!
t6d
+1  A: 
while read -r num; do ((sum += num)); done < inputfile; echo $sum
Dennis Williamson
+2  A: 

I like the chosen answer. However, it tends to be slower than awk, since two tools are needed to do the job.

$ wc -l file
49999998 file

$ time paste -sd+ file | bc
1448700364

real    1m36.960s
user    1m24.515s
sys     0m1.772s

$ time awk '{s+=$1}END{print s}' file
1448700364

real    0m45.476s
user    0m40.756s
sys     0m0.287s
ghostdog74
Good point! On SunOS 5.8 bc even core dumps with such a big input file (see my post below).
radoulov
awk is the correct tool for the job! The bc solution is OK, but what happens when you need to sum two columns, or perhaps filter out negative numbers? With awk you can easily and sensibly add in extra logic; with the bc solution you end up piping through yet another command (cut or grep).
James Anderson
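As a rough sketch of the extra logic that comment describes (the column numbers and the filter are illustrative assumptions, not from the thread):

# sum two columns at once
awk '{ s1 += $1; s2 += $2 } END { print s1, s2 }' file

# sum the first column, skipping negative values
awk '$1 >= 0 { s += $1 } END { print s }' file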
@radoulov: thanks, I didn't know Solaris's bc balks on big inputs.
ghostdog74
+1  A: 

[a follow-up to ghostdog74's comments]

bash-2.03$ uname -sr
SunOS 5.8

bash-2.03$ perl -le 'print for 1..49999998' > infile

bash-2.03$ wc -l infile
 49999998 infile

bash-2.03$  time paste -sd+ infile | bc
bundling space exceeded on line 1, teletype
Broken Pipe

real    0m0.062s
user    0m0.010s
sys     0m0.010s

bash-2.03$ time nawk '{s+=$1}END{print s}' infile
1249999925000001

real    2m0.042s
user    1m59.220s
sys     0m0.590s
bash-2.03$ time /usr/xpg4/bin/awk '{s+=$1}END{print s}' infile
1249999925000001

real    2m27.260s
user    2m26.230s
sys     0m0.660s

bash-2.03$ time perl -nle'
  $s += $_; END { print $s }
   ' infile
1.249999925e+15

real    1m34.663s
user    1m33.710s
sys     0m0.650s
radoulov
+1 for pointing out the core dump in the bc version of this. I'm hesitant to change the "answer" to this question because the bc version works fine for me (I'm totalling up 30 numbers).
jskaggz