Apparently *(cra+3) is a char holding the byte '\xdd'. Since char is signed on your platform, that byte is interpreted as -35 (0xdd in two's complement). Converting it to int sign-extends it to 0xffffffdd, and truncating that to a 16-bit unsigned short keeps the low bits 0xffdd, i.e. 65501.
You need to convert it to unsigned char first so that you get a value in the range 0–255:
```c
num = (unsigned char)cra[3];
```
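For illustration, here is a minimal sketch of both behaviours, assuming (as in the question) that num is an unsigned short and that cra holds the byte 0xdd at index 3 — the names and sample data are placeholders:

```c
#include <stdio.h>

int main(void) {
    char cra[] = "\x01\x02\x03\xdd";  /* placeholder data; cra[3] is the byte 0xdd */
    unsigned short num;

    num = cra[3];                 /* on signed-char platforms cra[3] is -35, converting to 65501 */
    printf("without cast: %hu\n", num);

    num = (unsigned char)cra[3];  /* the byte is reinterpreted as 221 first */
    printf("with cast:    %hu\n", num);

    return 0;
}
```

On a platform where char is signed this prints 65501 and then 221; where char is unsigned, both lines print 221.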
Note:
1. the signedness of char is implementation-defined, but usually (as in the OP's case) it is signed.
2. the ranges of signed char, unsigned char and unsigned short are also implementation-defined, but commonly they are -128 to 127, 0 to 255 and 0 to 65535 respectively (see the sketch after this list to check your own platform).
3. the conversion that produces 65501 is actually from signed char to unsigned short, not to unsigned char: an out-of-range value is brought into range by adding one more than the maximum, so -35 + 65536 = 65501. Conversion to unsigned char adds 256 instead, giving -35 + 256 = 221, which is why the cast above works.
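If you want to verify the ranges on your own implementation, the macros in <limits.h> report them; a quick sketch:

```c
#include <stdio.h>
#include <limits.h>

int main(void) {
    /* CHAR_MIN is 0 when plain char is unsigned, negative when it is signed */
    printf("char:           %d to %d\n", CHAR_MIN, CHAR_MAX);
    printf("signed char:    %d to %d\n", SCHAR_MIN, SCHAR_MAX);
    printf("unsigned char:  0 to %u\n", (unsigned)UCHAR_MAX);
    printf("unsigned short: 0 to %u\n", (unsigned)USHRT_MAX);
    return 0;
}
```

On a typical implementation with a signed 8-bit char, the first two lines both print -128 to 127.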