The following *nix command uses printf to emit the raw bytes of an IP address and port (127.0.0.1:80) and pipes them into the hexdump command.
printf "\x7F\x00\x00\x01\x00\x50" | hexdump -e '3/1 "%u." /1 "%u:" 1/2 "%u" "\n"'
The -e flag applies an arbitrary format string to the input. In this case, each of the first three octets of the IP is printed as an unsigned decimal followed by a dot. The final octet is likewise printed as an unsigned decimal, but followed by a colon. Finally -- and this is where the problem lies -- the 2 bytes for the port are printed as a single unsigned decimal followed by a newline.
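Spelled out unit by unit (the comments are my annotation; the command itself is identical to the one above):

# iteration_count/byte_count "format":
#   3/1 "%u."   three 1-byte units, each an unsigned decimal followed by "."
#   /1  "%u:"   one 1-byte unit, an unsigned decimal followed by ":"
#   1/2 "%u"    one 2-byte unit, a single unsigned decimal
#   "\n"        a literal newline (consumes no input)
printf "\x7F\x00\x00\x01\x00\x50" | hexdump -e '3/1 "%u." /1 "%u:" 1/2 "%u" "\n"'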
Depending on the endianness of the system executing this command, the result will differ: a big-endian system will correctly show port 80, whereas a little-endian system will show port 20480, because the 2-byte unit is read in host byte order (0x0050 versus 0x5000).
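To make the arithmetic concrete, here are the same two bytes combined both ways (shell arithmetic with 0x literals, as in bash):

echo $(( 0x00 * 256 + 0x50 ))   # bytes read big-endian: 80
echo $(( 0x50 * 256 + 0x00 ))   # bytes read little-endian: 20480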
Is there any way to make hexdump endianness-aware while still allowing an arbitrary format specification via -e?
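One workaround I can think of is to print the two port bytes individually and recombine them outside hexdump (a sketch using awk, assuming the big-endian network-order reading is the intended one), but that gives up the self-contained format string:

printf "\x7F\x00\x00\x01\x00\x50" |
  hexdump -e '3/1 "%u." /1 "%u " /1 "%u " /1 "%u" "\n"' |
  awk '{ printf "%s:%u\n", $1, $2 * 256 + $3 }'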