Can someone please explain why this program outputs 0x00000004?
using System;

class AndAssignment
{
    static void Main()
    {
        int a = 0x0c;
        a &= 0x06;
        Console.WriteLine("0x{0:x8}", a);
    }
}
/*
Output:
0x00000004
*/
0x0c = 1100 in binary
0x06 = 0110 in binary
The & operator is a bitwise AND, which sets a result bit to 1 only if that bit is set in both operands, so:
0x0c & 0x06 = 1100 & 0110 = 0100 = 0x04
You can use the Windows Calculator in programmer mode to see how integers are represented in different bases (hex and binary in your case).
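The bit-by-bit AND above can also be checked in code. This is a small sketch (the class name `AndDemo` is just illustrative); `Convert.ToString` with base 2 prints the binary form of each value.

```csharp
using System;

class AndDemo
{
    static void Main()
    {
        int a = 0x0c;       // 1100 in binary
        int b = 0x06;       // 0110 in binary
        int result = a & b; // only bits set in BOTH operands survive: 0100

        // Convert.ToString(value, 2) gives the binary representation.
        Console.WriteLine(Convert.ToString(a, 2).PadLeft(4, '0'));      // 1100
        Console.WriteLine(Convert.ToString(b, 2).PadLeft(4, '0'));      // 0110
        Console.WriteLine(Convert.ToString(result, 2).PadLeft(4, '0')); // 0100
        Console.WriteLine("0x{0:x2}", result);                          // 0x04
    }
}
```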
Thank you for your answer, but how did you get 0x0c = 1100 or 0100 = 0x04?
You have to know the basics of converting between number bases. Decimal is base 10, binary is base 2, and hexadecimal is base 16.
Look at the following table of powers of 16:
16^0 = 1
16^1 = 16
16^2 = 256
16^3 = 4096
Hexadecimal uses sixteen digits: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, a, b, c, d, e, f.
So you have 0c in hex, or just c. Expanding by place value:
16^0 x c (i.e. 12) = 12
16^1 x 0 = 0
Total: 12 in decimal.
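The place-value expansion above can be verified in code. A minimal sketch (the class name `HexToDecimal` is illustrative); `Convert.ToInt32` with base 16 performs the same conversion the framework way.

```csharp
using System;

class HexToDecimal
{
    static void Main()
    {
        // 0x0c expanded by place value: 0 * 16^1 + 12 * 16^0
        int fromPlaces = 0 * 16 + 12;

        // Convert.ToInt32(string, fromBase) does the same conversion:
        int fromConvert = Convert.ToInt32("0c", 16);

        Console.WriteLine(fromPlaces);  // 12
        Console.WriteLine(fromConvert); // 12
    }
}
```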
Now convert 12 decimal into binary. Here is a simple addition pattern (powers of 2) for small numbers:
2^0 = 1
2^1 = 2
2^2 = 4
2^3 = 8
2^4 = 16
So to make 12 in binary, you need one group of 2^3 (8) and one group of 2^2 (4). Therefore you have 1100.
If you convert it to decimal just like you did with hex, you'll end up with 12.
0 x 2^0 = 0
0 x 2^1 = 0
1 x 2^2 = 4
1 x 2^3 = 8
total = 12.
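The binary-to-decimal sum above can be checked the same way. A short sketch (the class name `BinaryToDecimal` is illustrative); `Convert.ToInt32` with base 2 performs the same place-value expansion.

```csharp
using System;

class BinaryToDecimal
{
    static void Main()
    {
        // 1100 expanded by place value, matching the sums above:
        int total = 1 * 8 + 1 * 4 + 0 * 2 + 0 * 1;

        Console.WriteLine(total);                      // 12
        // Convert.ToInt32(string, fromBase) does the same expansion:
        Console.WriteLine(Convert.ToInt32("1100", 2)); // 12
    }
}
```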
Sara, he's converting between hexadecimal and binary. Have a read of this:
http://www.purplemath.com/modules/numbbase.htm
C (in hex) = 12 (in base ten)
1100 (in binary) = 12 (in base ten)