Hi!

I'm generating an image using Quartz 2D and I want to use it as an OpenGL texture. The tricky part is that I want to use as few bits per pixel as possible, so I'm creating the CGContext as follows:

int bitsPerComponent = 5;
int bytesPerPixel = 2;
int width = 1024;
int height = 1024;
void* imageData = malloc(width * height * bytesPerPixel);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(imageData, width, height, bitsPerComponent, width * bytesPerPixel, colorSpace, kCGImageAlphaNoneSkipFirst);
//draw things into context, release memory, etc.

As stated in the documentation here, this is the only supported RGB pixel format for CGBitmapContextCreate that uses 16 bits per pixel. So now I want to upload this imageData, which is laid out as "1 bit skipped - 5 bits red - 5 bits green - 5 bits blue", into an OpenGL texture. So I should do something like this:

GLuint texture;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1, imageData);

That won't work, because in this call I've specified the pixel format as "5 bits red - 5 bits green - 5 bits blue - 1 bit alpha". That is wrong, and it appears there is no OpenGL format that matches the Core Graphics output.
There are other options like GL_UNSIGNED_SHORT_1_5_5_5_REV, but those won't work on the iPhone.
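To make the mismatch concrete, this is how I understand the two 16-bit layouts (my reading of the Quartz docs and the GL spec, so please correct me if I have a bit wrong):

/* Core Graphics, kCGImageAlphaNoneSkipFirst, 5 bits per component, 16 bits per pixel:
 *   bit 15        bits 14-10   bits 9-5   bits 4-0
 *   X (skipped)   red          green      blue
 *
 * OpenGL, GL_RGBA + GL_UNSIGNED_SHORT_5_5_5_1:
 *   bits 15-11    bits 10-6    bits 5-1   bit 0
 *   red           green        blue       alpha
 */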

I need some way to use this imageData as a texture, but I really don't want to shift the bits around manually in a per-pixel loop (or with memset-style tricks), because that seems terribly inefficient.
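For reference, this is the kind of conversion I'm trying to avoid. It's only a sketch, assuming the layouts above and that the 16-bit pixels come out in host byte order: shift each pixel left by one bit so red/green/blue line up with the 5551 layout, and force the alpha bit to 1.

#include <stdint.h>

// In-place conversion from "X-5-5-5" (Core Graphics) to "5-5-5-1" (OpenGL).
// imageData, width and height are the same variables as in the snippets above.
uint16_t* pixels = (uint16_t*)imageData;
for (int i = 0; i < width * height; i++) {
    // Drop the skipped bit, move R/G/B up by one, set alpha to 1.
    pixels[i] = (uint16_t)((pixels[i] << 1) | 1);
}

After that the glTexImage2D call above with GL_UNSIGNED_SHORT_5_5_5_1 would read the data correctly, but looping over a 1024x1024 image every time is exactly the overhead I'd like to skip.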