I am trying to obtain the alpha value of a pixel in a UIImageView. I have obtained the CGImage from [UIImageView image] and created an RGBA byte array from it. Alpha is premultiplied.
CGImageRef image = uiImage.CGImage;
NSUInteger width = CGImageGetWidth(image);
NSUInteger height = CGImageGetHeight(image);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

// Buffer for the RGBA pixel data, 4 bytes per pixel; remember to free() it when done.
unsigned char *rawData = malloc(height * width * 4);
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;

CGContextRef context = CGBitmapContextCreate(
    rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace,
    kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big
);
CGColorSpaceRelease(colorSpace);

// Draw the image into the bitmap context, which fills rawData with the RGBA bytes.
CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
CGContextRelease(context);
I then calculate the array index of the alpha byte for a given pixel using its coordinates in the UIImageView.
// uiViewPoint is a CGPoint, so the coordinates are CGFloats; truncate them to integer pixel indices.
NSUInteger byteIndex = (bytesPerRow * (NSUInteger)uiViewPoint.y) + (NSUInteger)uiViewPoint.x * bytesPerPixel;
// RGBA layout: red, green, blue, then alpha, so the alpha byte sits at offset 3 within the pixel.
unsigned char alpha = rawData[byteIndex + 3];
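As a sanity check of the arithmetic (using a hypothetical 100 x 100 image): bytesPerRow is 4 * 100 = 400, so the pixel at (10, 5) starts at byte 400 * 5 + 10 * 4 = 2040, and its alpha byte is rawData[2043].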
However, I don't get the values I expect. For a completely black, transparent area of the image I get non-zero values for the alpha channel. Do I need to translate the coordinates between UIKit and Core Graphics, i.e. is the y-axis inverted? Or have I misunderstood premultiplied alpha values?
Update:
@Nikolai Ruhe's suggestion was key to this. I did not in fact need to translate between UIKit coordinates and Core Graphics coordinates. However, after setting the blend mode, my alpha values were what I expected:
CGContextSetBlendMode(context, kCGBlendModeCopy);
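For reference, here is a minimal sketch of where that call fits in the setup above (same variables as before); the blend mode has to be set on the bitmap context before the image is drawn into it:

CGContextRef context = CGBitmapContextCreate(
    rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace,
    kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big
);
CGColorSpaceRelease(colorSpace);

// Copy the source pixels straight into the buffer rather than compositing
// them over whatever malloc happened to leave in rawData.
CGContextSetBlendMode(context, kCGBlendModeCopy);

CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
CGContextRelease(context);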