I'm trying to extract the RGB components of a UIColor in order to hand-build the pixels in a CGBitmapContext. The following sample code works fine for most of the UIColor constants but, confusingly, not all. To wit:
CGColorRef color = [[UIColor yellowColor] CGColor];
const CGFloat *rgba = CGColorGetComponents(color);
CGFloat r = rgba[0];
CGFloat g = rgba[1];
CGFloat b = rgba[2];
CGFloat a = rgba[3];
NSLog(@"r=%f g=%f b=%f a=%f", r, g, b, a);
The results for [UIColor yellowColor] above are r=1.000000 g=1.000000 b=0.000000 a=1.000000, as expected. [UIColor redColor] gives r=1.000000 g=0.000000 b=0.000000 a=1.000000, again as expected. Similarly for blueColor and greenColor.
However, the results for [UIColor blackColor] and [UIColor whiteColor] seem completely anomalous, and I don't know what I'm doing wrong (if indeed I am).
To wit, [UIColor blackColor] gives r=0.000000 g=1.000000 b=0.000000 a=0.000000, which is a transparent green, and [UIColor whiteColor] gives r=1.000000 g=1.000000 b=0.000000 a=0.000000, which is a transparent yellow.
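In case the backing color space is somehow the variable here, a quick check of its model might be informative. Again, a sketch only, assuming blackColor behaves the same when accessed through CGColor:

#import <UIKit/UIKit.h>

// Inspect which color-space model backs the color (RGB, monochrome, etc.).
CGColorRef black = [[UIColor blackColor] CGColor];
CGColorSpaceRef space = CGColorGetColorSpace(black);
CGColorSpaceModel model = CGColorSpaceGetModel(space);
NSLog(@"model=%d components=%zu",
      (int)model, CGColorGetNumberOfComponents(black));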
I'd appreciate it if somebody could either:
(1) explain what I'm doing wrong,
(2) replicate my anomalous results and tell me it's not me, or
(3) hit me over the head with a big hammer so it stops hurting so much.
Howard