views: 41

answers: 1

My library needs to read in big-endian integers (4 bytes) and convert them to the host's byte order for processing. While ntohl has worked a treat on *nix, using ntohl under Windows requires Ws2_32.dll (Winsock).

Such a dependency is one I would rather eliminate. The simplest way to do this appears to be to write my own endian-swapping function (a trivial exercise, considering performance is not a real concern). However, this requires a way to determine the endianness of the system my library is being compiled on (so I can #ifdef out the swapping function on big-endian systems).
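(For illustration, a minimal sketch of the kind of swap function I have in mind; the name swap_uint32 is merely a placeholder, and it assumes the values arrive as uint32_t:)

#include <stdint.h>

/* Unconditionally reverse the byte order of a 32-bit value. */
uint32_t swap_uint32(uint32_t value)
{
    return ((value & 0x000000FFu) << 24) |
           ((value & 0x0000FF00u) << 8)  |
           ((value & 0x00FF0000u) >> 8)  |
           ((value & 0xFF000000u) >> 24);
}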

As there appears to be no standard preprocessor definition for endianness, it seems necessary to determine it using my build system (cmake). What is the best means of doing this? (I am wary of 'compile a test file and see' type solutions, as they would appear to inhibit cross-compiling.)

+2  A: 

Edited: I see that cmake has a TestBigEndian.cmake script, but it does a compilation and test run to see whether the system is big endian or not, which is not what you want to do.

You can check for the system's endianness in your own program with a function like this:

bool isLittleEndian()
{
    // Store 1 in a multi-byte integer, then look at the byte at its lowest address.
    short int number = 0x1;
    char *numPtr = (char*)&number;
    // On a little-endian machine the least significant byte is stored first.
    return (numPtr[0] == 1);
}

Basically, create an integer and read the byte at its lowest address. If that byte is 1 (i.e. it holds the least significant byte), the system is little endian; otherwise it's big endian.

However, this doesn't let you determine endianness until runtime.

If you want compile-time determination of system endianness, I do not see much alternative aside from the 'build a test program, then compile my real program' approach à la cmake, or doing exhaustive checks for the specific macros defined by compilers, e.g. __BIG_ENDIAN__ on GCC 4.x.
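As a rough (and deliberately incomplete) sketch of the macro-checking approach; which macros exist varies by compiler and platform, and MYLIB_BIG_ENDIAN here is just an illustrative name:

#if defined(__BYTE_ORDER__) && (__BYTE_ORDER__ == __ORDER_BIG_ENDIAN__)
    /* newer GCC/Clang expose __BYTE_ORDER__ */
    #define MYLIB_BIG_ENDIAN 1
#elif defined(__BIG_ENDIAN__)
    /* e.g. GCC 4.x on big-endian targets */
    #define MYLIB_BIG_ENDIAN 1
#else
    /* fall back to assuming little endian and byte-swapping on read */
    #define MYLIB_BIG_ENDIAN 0
#endif

The #else branch is the weak spot: an unrecognised compiler is silently treated as little endian, which is why these checks tend to grow into long chains like the Boost header below.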

UPDATED: You might want to take a look at Boost's own endian.hpp as an example of the macro-checking approach, too. http://www.boost.org/doc/libs/1_43_0/boost/detail/endian.hpp
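For instance, assuming the BOOST_BIG_ENDIAN / BOOST_LITTLE_ENDIAN macros that header defines, usage would look roughly like:

#include <boost/detail/endian.hpp>

#if defined(BOOST_BIG_ENDIAN)
    /* host already matches the big-endian wire format; no swap needed */
#elif defined(BOOST_LITTLE_ENDIAN)
    /* byte-swap values after reading them */
#else
    #error "Unknown host endianness"
#endif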

birryree