I'm about at my wit's end over this issue, having hit a good dozen other websites (as well as this one) looking for a straightforward answer to my problem...
First off, I'm trying to figure out a way, given a starting image created via [UIImage imageNamed:], to add decorations to it based on processing over time. (Think of a heat map.) I realized early on that I'd need to copy the image, since images loaded via imageNamed: live in a shared cache, so I'm copying the raw data and creating new images and contexts through the underlying Core Graphics API. While I've been able to copy the image and bind it to a UIView, I can't figure out how to change its contents afterwards... The following is what I'm currently up to in my code...
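(Stripped of all the CoreGraphics calls, the copy-then-mutate idea I'm after boils down to this in plain C — hypothetical helper names, just to show the intent: duplicate the cached pixel data into a private buffer, then scribble on the copy without disturbing the original:)

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Make a private, writable copy of a pixel buffer (RGBA, 4 bytes/pixel).
 * The caller owns the returned buffer and must free() it. */
uint8_t *copy_pixels(const uint8_t *src, size_t width, size_t height) {
    size_t byteCount = width * height * 4;
    uint8_t *dst = malloc(byteCount);
    if (dst) memcpy(dst, src, byteCount);
    return dst;
}

/* Overwrite one pixel in the copy; the source buffer is untouched. */
void set_pixel(uint8_t *pixels, size_t width, size_t x, size_t y,
               uint8_t r, uint8_t g, uint8_t b, uint8_t a) {
    uint8_t *p = pixels + (y * width + x) * 4;
    p[0] = r; p[1] = g; p[2] = b; p[3] = a;
}
```

With a working copy like that in hand, here's the actual Objective-C attempt: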
@interface HeatMapTestViewController : UIViewController {
    CFDataRef imageData;
    CGColorSpaceRef colorSpace;
    CGContextRef context;
    CGColorRef paintColor;
    CGImageRef image;
}

- (IBAction) imageButtonTapped:(UIButton*)sender forEvent:(UIEvent*)event;

@end
@implementation HeatMapTestViewController

- (void) viewDidLoad {
    [super viewDidLoad];

    UIImage* backImage = [UIImage imageNamed:@"Grey Checkerboard.png"];
    UIColor* backdrop = [UIColor colorWithPatternImage:backImage];
    [[self view] setBackgroundColor:backdrop];

    UIImage* startImage = [UIImage imageNamed:@"Starting Image.png"];
    CGImageRef pixelmap = [startImage CGImage];

    // Copy the raw pixel data out of the cached image.
    // http://www.iphonedevsdk.com/forum/iphone-sdk-development/34247-cgimage-pixel-array.html
    CGDataProviderRef dataProvider = CGImageGetDataProvider(pixelmap);
    imageData = CGDataProviderCopyData(dataProvider);
    void* rawPixels = (void*)CFDataGetBytePtr(imageData);

    colorSpace = CGColorSpaceCreateDeviceRGB();

    // CGBitmapContextCreate only supports premultiplied alpha, so coerce the
    // image's alpha info if necessary.
    // http://developer.apple.com/mac/library/qa/qa2001/qa1037.html
    CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(pixelmap);
    if ( ( bitmapInfo & kCGBitmapAlphaInfoMask ) == kCGImageAlphaLast )
        bitmapInfo = ( bitmapInfo & kCGBitmapByteOrderMask ) | kCGImageAlphaPremultipliedLast;
    if ( ( bitmapInfo & kCGBitmapAlphaInfoMask ) == kCGImageAlphaFirst )
        bitmapInfo = ( bitmapInfo & kCGBitmapByteOrderMask ) | kCGImageAlphaPremultipliedFirst;

    context = CGBitmapContextCreate(
        rawPixels,
        CGImageGetWidth(pixelmap),
        CGImageGetHeight(pixelmap),
        CGImageGetBitsPerComponent(pixelmap),
        CGImageGetBytesPerRow(pixelmap),
        colorSpace,
        bitmapInfo
    );

    CGFloat components[] = {1.0, 0.0, 0.0, 1.0};
    paintColor = CGColorCreate(colorSpace, components);

    image = CGBitmapContextCreateImage(context);
    UIImage* newImage = [UIImage imageWithCGImage:image];
    UIButton* button = (UIButton*)[[self view] viewWithTag:327];
    [button setBackgroundImage:newImage forState:UIControlStateNormal];
}

- (IBAction) imageButtonTapped:(UIButton*)sender forEvent:(UIEvent*)event {
    // Assuming we always get just one touch due to the Interface Builder setting...
    UITouch* touch = [[event touchesForView:sender] anyObject];
    CGPoint touchPoint = [touch locationInView:sender];

    CGRect touchRect = CGRectZero;
    touchRect.origin = touchPoint;
    touchRect = CGRectInset(touchRect, -11.0, -11.0);

    UIGraphicsPushContext(context);
    CGContextSetFillColorWithColor(context, paintColor);
    CGContextFillEllipseInRect(context, touchRect);
    UIGraphicsPopContext();

    UIImage* newImage = [UIImage imageWithCGImage:image];
    [sender setBackgroundImage:newImage forState:UIControlStateNormal];
    [sender setNeedsDisplayInRect:touchRect];
} // imageButtonTapped:forEvent:

@end
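(As an aside, the bitmapInfo juggling in viewDidLoad above is there because CGBitmapContextCreate only accepts premultiplied alpha formats. The premultiplication itself is just per-channel math — sketched here in plain C with a hypothetical helper name, matching the stored-value convention of kCGImageAlphaPremultipliedLast:)

```c
#include <stdint.h>

/* Premultiply one 8-bit color channel by its alpha, rounding to nearest:
 * stored = channel * alpha / 255. This is what distinguishes
 * kCGImageAlphaPremultipliedLast from kCGImageAlphaLast. */
uint8_t premultiply(uint8_t channel, uint8_t alpha) {
    return (uint8_t)(((unsigned)channel * alpha + 127) / 255);
}
```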
The real-time tapping feedback is just for demo purposes... What I really plan to do in the final product is show the modified image, on user request, behind the screen the user is currently looking at. My Google-fu has been failing me; I can't seem to find example code for this in Apple's documentation. I've even tried finding code examples from "painting" apps for the iPhone, since that type of app seems to share the underlying functionality of what I'm trying to accomplish. No dice. /-:
EDIT: After much more experimentation and research, I got something like this to work by taking a different approach...
@interface HeatMapTestViewController : UIViewController {
    CGColorSpaceRef colorSpace;
    CGColorRef paintColor;
}

- (IBAction) imageButtonTapped:(UIButton*)sender forEvent:(UIEvent*)event;

@end
@implementation HeatMapTestViewController

- (void) viewDidLoad {
    [super viewDidLoad];

    UIImage* backImage = [UIImage imageNamed:@"Grey Checkerboard.png"];
    UIColor* backdrop = [UIColor colorWithPatternImage:backImage];
    [[self view] setBackgroundColor:backdrop];

    UIImage* startImage = [UIImage imageNamed:@"Starting Image.png"];
    UIButton* button = (UIButton*)[[self view] viewWithTag:327];
    [button setBackgroundImage:startImage forState:UIControlStateNormal];

    colorSpace = CGColorSpaceCreateDeviceRGB();
    CGFloat components[] = {1.0, 0.0, 0.0, 1.0};
    paintColor = CGColorCreate(colorSpace, components);
}

- (IBAction) imageButtonTapped:(UIButton*)sender forEvent:(UIEvent*)event {
    // Assuming we always get just one touch due to the Interface Builder setting...
    UITouch* touch = [[event touchesForView:sender] anyObject];
    CGPoint touchPoint = [touch locationInView:sender];

    CGRect touchRect = CGRectZero;
    touchRect.origin = touchPoint;
    touchRect = CGRectInset(touchRect, -11.0, -11.0);

    UIImage* image = [sender backgroundImageForState:UIControlStateNormal];

    // http://www.ipodtouchfans.com/forums/showthread.php?t=132024
    UIGraphicsBeginImageContext([image size]);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [image drawInRect:CGRectMake(0.0, 0.0, [image size].width, [image size].height)];
    // This repaints the entire image! Boo hiss!

    CGContextSetFillColorWithColor(context, paintColor);
    CGContextFillEllipseInRect(context, touchRect);

    UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [sender setBackgroundImage:newImage forState:UIControlStateNormal];
} // imageButtonTapped:forEvent:

@end
The big honking problem with this is noted by the comment: if I'm understanding the above correctly, I'm redrawing the entire contents of the image into a new one just to update a small portion of it. That doesn't sound like something I can get away with 20-30 times a second.
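For what it's worth, the alternative I keep circling back to is keeping one persistent pixel buffer and touching only the pixels inside the dirty rect, rather than repainting the whole image per tap. In plain C terms (hypothetical names, a square-bounded filled circle standing in for what CGContextFillEllipseInRect does), the per-tap work would look like this:

```c
#include <stdint.h>

/* Paint a filled circle into a persistent RGBA buffer, visiting only the
 * pixels inside the circle's bounding box (the "dirty rect") instead of
 * repainting the entire image. */
void paint_circle(uint8_t *pixels, int width, int height,
                  int cx, int cy, int radius,
                  uint8_t r, uint8_t g, uint8_t b) {
    for (int y = cy - radius; y <= cy + radius; y++) {
        if (y < 0 || y >= height) continue;       /* clip to the buffer */
        for (int x = cx - radius; x <= cx + radius; x++) {
            if (x < 0 || x >= width) continue;
            int dx = x - cx, dy = y - cy;
            if (dx * dx + dy * dy <= radius * radius) {
                uint8_t *p = pixels + (y * width + x) * 4;
                p[0] = r; p[1] = g; p[2] = b; p[3] = 255;
            }
        }
    }
}
```

The per-tap cost is then proportional to the brush size, not the image size, which seems like the only way 20-30 updates a second could be sustainable.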