Hi All,
I would like my iPhone app to take a UIImage from the image library, the camera roll, or the live camera, and then create a new UIImage with both the color and the alpha values of its pixels altered. (For example, if a source pixel is close to black, make it more transparent than a brighter pixel.)
I have code that will create a CGImageRef and associated CGBitmapInfo, CGColorSpaceRef and other necessary structures from the source image. I can get a copy of the pixel data from CGDataProviderCopyData and iterate through the pixels, inspecting and adjusting as necessary.
I can even create a new UIImage from the altered pixels (using CGImageCreate).
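To give a sense of the shape of it, here is a stripped-down sketch of that flow. (This is not my exact code: it assumes 8-bit RGBA pixels, uses a made-up darker-means-more-transparent rule in place of my real logic, and the helper name is just for illustration.)

#include <CoreGraphics/CoreGraphics.h>

// Sketch only: assumes 8 bits per component, 32 bits per pixel,
// with the alpha byte last (RGBA).
static CGImageRef CreateImageWithAdjustedAlpha(CGImageRef source)
{
    size_t width       = CGImageGetWidth(source);
    size_t height      = CGImageGetHeight(source);
    size_t bytesPerRow = CGImageGetBytesPerRow(source);

    // Grab a mutable copy of the raw pixel bytes.
    CFDataRef original      = CGDataProviderCopyData(CGImageGetDataProvider(source));
    CFMutableDataRef pixels = CFDataCreateMutableCopy(kCFAllocatorDefault, 0, original);
    CFRelease(original);

    UInt8 *base = CFDataGetMutableBytePtr(pixels);
    for (size_t y = 0; y < height; y++) {
        for (size_t x = 0; x < width; x++) {
            UInt8 *pixel = base + y * bytesPerRow + x * 4;   // RGBA assumed
            UInt8 r = pixel[0], g = pixel[1], b = pixel[2];
            // Placeholder rule: darker pixels become more transparent.
            pixel[3] = (UInt8)((r + g + b) / 3);
        }
    }

    // Wrap the altered bytes in a new provider and build a new image.
    CGDataProviderRef provider = CGDataProviderCreateWithCFData(pixels);
    CGImageRef result = CGImageCreate(width, height,
                                      8, 32, bytesPerRow,
                                      CGImageGetColorSpace(source),
                                      CGImageGetBitmapInfo(source),  // reusing the source's bitmap info
                                      provider, NULL, false,
                                      kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    CFRelease(pixels);
    return result;   // wrapped with [UIImage imageWithCGImage:] and released by the caller
}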
When the image comes from the image library using a UIImagePicker (with editing allowed), I have no problem at all. In those cases, the alpha portion of the original image's CGBitmapInfo (the bits under kCGBitmapAlphaInfoMask) is kCGImageAlphaPremultipliedFirst.
But when the image comes live from the camera (again via the UIImagePicker) or from the camera roll, the alpha portion is instead kCGImageAlphaNoneSkipLast. Each pixel still carries a trailing byte where the alpha would go, but it is hardwired to a value of 25 for every pixel, and modifying that value has no effect on the new image I've created.
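For what it's worth, I'm checking the layout of the picked image with something along these lines:

#include <CoreGraphics/CoreGraphics.h>
#include <stdio.h>

// Quick dump of how a picked image is laid out
// (cgImage is the picked UIImage's CGImage).
static void DumpAlphaInfo(CGImageRef cgImage)
{
    CGBitmapInfo info      = CGImageGetBitmapInfo(cgImage);
    CGImageAlphaInfo alpha = CGImageGetAlphaInfo(cgImage);   // same as (info & kCGBitmapAlphaInfoMask)

    // Library/edited picks report kCGImageAlphaPremultipliedFirst here;
    // camera and camera-roll picks report kCGImageAlphaNoneSkipLast,
    // i.e. a fourth byte per pixel is present but ignored.
    printf("bitmapInfo = 0x%x, alphaInfo = %d\n", (unsigned int)info, (int)alpha);
}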
What do I do with the data coming from the camera or the camera roll, which arrives as kCGImageAlphaNoneSkipLast, if I want to fiddle with the alpha values? The CGImageCreate function doesn't seem to have a way to specify alpha use, only the basic byte ordering via the CGBitmapInfo parameter.
Said another way, how do I force a new UIImage to support alpha channels?
Beyond the rough sketches above, I haven't posted my actual code, because this is still a "big picture" issue for me. If you would like, I can certainly post what I have so far.
-VTPete
Follow-up: I think I solved my own problem, thanks to a similar post elsewhere. The trick is setting the bitmapInfo parameter of CGImageCreate to the OR'd values kCGBitmapByteOrderDefault and kCGImageAlphaLast. The part that REALLY has me flummoxed is that the docs for CGBitmapInfo's values make no mention of kCGImageAlphaLast (which is exactly what I was looking for). (And what does kCGBitmapByteOrderDefault REALLY mean, anyway?)
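In other words, the only change is to the bitmapInfo argument of CGImageCreate. Something along these lines (the helper name and the 8-bit RGBA numbers are just for illustration; the rest of the call is the same as in the sketch above):

#include <CoreGraphics/CoreGraphics.h>

// Same CGImageCreate call as before, but with an explicit bitmapInfo that
// declares a real (non-premultiplied) alpha byte at the end of each pixel.
static CGImageRef CreateRGBAImage(size_t width, size_t height, size_t bytesPerRow,
                                  CGColorSpaceRef colorSpace, CGDataProviderRef provider)
{
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | (CGBitmapInfo)kCGImageAlphaLast;

    return CGImageCreate(width, height,
                         8,            // bits per component
                         32,           // bits per pixel
                         bytesPerRow,
                         colorSpace,
                         bitmapInfo,   // <-- the part that was tripping me up
                         provider,
                         NULL,         // no decode array
                         false,        // no interpolation
                         kCGRenderingIntentDefault);
}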
Hope this post helps someone else!
Perhaps someone who understands these libraries can explain why some images are read in with an active alpha channel and some aren't? I feel like I'm missing the big picture here and can't find a good resource that explains the mysteries behind the images coming from the UIImagePicker's various incarnations.