tags:
views: 162
answers: 2

Hi,

How would you go about converting the following C #define into C#?

        #define get16bits(d) (*((const uint16_t *) (d)))
        #if !defined (get16bits)
        #define get16bits(d) ((((uint32_t)(((const uint8_t *)(d))[1])) << 8)\
                  +(uint32_t)(((const uint8_t *)(d))[0]) )
        #endif

I know you'd probably replace uint32_t with UInt32 and change the other types to their C# equivalents, but how would I proceed from there to make the above a static method? Would that be the best way of going about it?

Bob.

+2  A: 

I do not know why you are checking whether get16bits is defined immediately after you define it; the only way it could be undefined at that point is a preprocessor error, which would stop your compile.

Now, that said, here's how you translate that godawful macro to C#:

aNumber & 0xFFFF;

In fact, here's how you translate that macro to C:

a_number & 0xFFFF;

You don't need all this casting wizardry just to get the lower 16 bits of a number. Here are more C defines to show you what I'm talking about:

#define   getbyte(d)     (d & 0xFF)
#define   getword(d)     (d & 0xFFFF)
#define   getdword(d)    (d & 0xFFFFFFFF)
#define   gethighword(d) ((d & 0xFFFF0000) >> 16)
#define   gethighbyte(d) ((d & 0xFF00) >> 8)

Have you really used that macro in production code?

Jed Smith
Hi Jed, well, they're called by a hash function that's written in C. I never wrote it, but I want to use it in C#.
scope_creep
Those macros made me wash my hands after reading them. What are you porting to C#? Maybe it's already been done.
Jed Smith
Lol, I thought they were a bit funky as well. It's not been converted to C# by anybody. I originally created a C# to C interop call, but I would have ended up with a tiny DLL to look after, so I thought I would convert it to C#.
scope_creep
A: 

Getting the lower 16 bits of an integer is quite easy in C#:

int x = 0x12345678;
short y = (short)x; // gets 0x5678

If you want a static method for doing it, it's just as simple:

public static short Get16Bits(int value) {
   return (short)value;
}
Guffa
Can I ask why you go through the trouble of casting instead of using straight logic (which is probably much less expensive)?
Jed Smith
Guffa
@Guffa: Just different perspectives, I guess -- coming from a C world, I'd expect that a cast would result in an object being allocated, initialized, and given the value (instead of potentially a one-clock instruction for the logic). I was just curious.
Jed Smith