I'm trying to write an iPhone app that takes PNG tilesets and displays segments of them on-screen, and I'm trying to get it to refresh the whole screen at 20 fps. Currently I'm managing about 3–4 fps on the simulator, and 0.5 to 2 fps on the device (an iPhone 3G), depending on how much stuff is on the screen.
I'm using Core Graphics at the moment and trying to find ways to avoid biting the bullet and refactoring in OpenGL. I've done a Shark time-profile analysis on the code, and about 70–80% of everything that's going on boils down to a function called copyImageBlockSetPNG, which is being called from within CGContextDrawImage, which itself is calling all sorts of other functions with PNG in the name. zlib's inflate is also in there, accounting for 37% of the time.
Question is, I already loaded the image into memory from a UIImage, so why does the code still care that it was a PNG? Does it not decompress into a native uncompressed format on load? Can I convert it myself? The analysis implies that it's decompressing the image every time I draw a section from it, which ends up being 30 or more times a frame.
Solution
// Redraws the image into a bitmap context, forcing the PNG to be decoded
// once. The returned image is backed by raw pixels, so subsequent
// CGContextDrawImage calls don't re-inflate the PNG data.
// Caller owns the returned image and must CGImageRelease() it.
- (CGImageRef)inflate:(CGImageRef)compressedImage
{
    size_t width = CGImageGetWidth(compressedImage);
    size_t height = CGImageGetHeight(compressedImage);
    size_t bitmapBytesPerRow = width * 4;   // 4 bytes per pixel (RGBA)

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 width,
                                                 height,
                                                 8,   // bits per component
                                                 bitmapBytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    if (context == NULL)
        return NULL;

    CGContextDrawImage(context, CGRectMake(0, 0, width, height), compressedImage);
    CGImageRef result = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    return result;
}
It's based on zneak's code (so he gets the big tick), but I've changed some of the parameters to CGBitmapContextCreate to stop it crashing when I feed it my PNG images.
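For completeness, here's a rough sketch of how the method above might be used: decompress the tileset once at load time and keep the resulting CGImageRef around, so per-frame blits draw from raw pixels instead of re-decoding the PNG. The names tileset.png, _decompressedTileset, and loadTileset are illustrative, not from the original post, and the memory management is manual-retain-release to match the era of this code.

```objc
// Assumed ivar: CGImageRef _decompressedTileset;

- (void)loadTileset
{
    UIImage *tilesetImage = [UIImage imageNamed:@"tileset.png"];
    // Decode the PNG once up front; drawing sections of
    // _decompressedTileset later won't touch the PNG decoder.
    _decompressedTileset = [self inflate:tilesetImage.CGImage];
}

- (void)dealloc
{
    if (_decompressedTileset)
        CGImageRelease(_decompressedTileset);  // we own it (Create rule)
    [super dealloc];
}
```

The key point is that the expensive call happens once, at load, rather than 30+ times per frame inside the draw loop.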