I am developing an Autoconf script for a project which should run on Linux and Mac, 32- and 64-bit. On Linux, detecting 64-bit is easy: you get x86_64 instead of i386 for build_cpu. I can't figure out how to do it for the Mac: both 32- and 64-bit machines give i386 for build_cpu. Is there a way to detect the difference using Autoconf builtins?
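For reference, here is roughly the check I have now, as a sketch (the project name and messages are placeholders; the macros are the standard Autoconf ones, and the case labels are just what I see on my machines):

    # configure.ac (sketch)
    AC_INIT([myproject], [1.0])
    AC_CANONICAL_BUILD            # sets $build_cpu, $build_vendor, $build_os

    case "$build_cpu" in
      x86_64) AC_MSG_NOTICE([64-bit build CPU]) ;;        # what I get on 64-bit Linux
      i386)   AC_MSG_NOTICE([32-bit? build CPU]) ;;       # both 32- and 64-bit Macs report this
      *)      AC_MSG_NOTICE([unknown CPU: $build_cpu]) ;;
    esac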
Bonus question: on a 64-bit CPU, is there a better way to programmatically detect whether a binary is 32- or 64-bit than the following?
file NAME_OF_BINARY | sed -e 's/.*[^0-9]\([0-9]*\)-bit.*/\1/g'
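For context, the sed is just pulling the digits before "-bit" out of file's description. On my Linux box it looks like this (the path and exact wording will differ on other systems and file versions):

    $ file /bin/ls
    /bin/ls: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked, ...
    $ file /bin/ls | sed -e 's/.*[^0-9]\([0-9]*\)-bit.*/\1/g'
    64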