Normally, when we want to load a .png as a texture for OpenGL ES, we simply add the .png images to Xcode. Xcode alters the .png files for optimization, and these altered files can be loaded into an OpenGL ES texture at runtime.
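For reference, a minimal sketch of that usual bundle-based path might look like the following (the "texture.png" file name, the 512x512 size, and the GL setup are just illustrative):

// Sketch: load a bundled png and upload it as an RGBA texture.
UIImage *bundledImg = [UIImage imageNamed:@"texture.png"];   // hypothetical bundled file
CGImageRef bundledImage = bundledImg.CGImage;
size_t w = CGImageGetWidth(bundledImage);
size_t h = CGImageGetHeight(bundledImage);

// Redraw into a known RGBA8888 buffer so the pixel layout is predictable.
void *pixels = calloc(w * h, 4);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(pixels, w, h, 8, w * 4,
                                         colorSpace, kCGImageAlphaPremultipliedLast);
CGContextDrawImage(ctx, CGRectMake(0, 0, w, h), bundledImage);
CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);

// Upload the full image; glTexImage2D allocates the texture storage itself.
GLuint textureID;
glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)w, (GLsizei)h,
             0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
free(pixels);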

However, what I am trying to do is quite different. I am trying to load a .png file that is not prebuilt or compiled into the app. The png file is transmitted externally over UDP and arrives as an array of bytes. I am very sure the png is transferred correctly, but when it comes to displaying it as an OpenGL ES texture, the image shows up incorrectly. The colors that were sent are all present, but their positions are badly scrambled, although they still retain some aspects of the original layout. Here:

[Screenshot: original .png on the left, the same png rendered as an OpenGL ES texture on the iPhone on the right]

The left image shows the original .png, while the right shows the png displayed on the iPhone as an OpenGL ES texture. It looks as if the png data is not being decoded, or is being processed incorrectly.

Below is the OpenGL ES code for turning the image into a texture:

- (void) setTextureFromImageByte: (uint8_t*)imageByte{
    if (self = [super init]){
        NSData* imageData = [[NSData alloc] initWithBytes: imageByte length: imageLength];
        UIImage* img = [[UIImage alloc] initWithData: imageData];
        CGImageRef image = img.CGImage;

        int width = 512;
        int height = 512;

        if (image){
            int tempWidth = (int)width, tempHeight = (int)height;

            if ((tempWidth & (tempWidth - 1)) != 0 ){
                NSLog(@"CAUTION! width is not power of 2. width == %d", tempWidth);
            }else if ((tempHeight & (tempHeight - 1)) != 0 ){
                NSLog(@"CAUTION! height is not power of 2. height == %d", tempHeight);
            }else{
                void *spriteData = calloc(width * 4, height * 4);

                CGContextRef spriteContext = CGBitmapContextCreate(spriteData, width, height, 8, width*4, CGImageGetColorSpace(image), kCGImageAlphaPremultipliedLast);
                CGContextDrawImage(spriteContext, CGRectMake(0.0, 0.0, width, height), image);
                CGContextRelease(spriteContext);

                glBindTexture(GL_TEXTURE_2D, 1);
                glTexSubImage2D(GL_TEXTURE_2D, 
                                0, 
                                0, 
                                0, 
                                320, 
                                435, 
                                GL_RGBA, 
                                GL_UNSIGNED_BYTE, 
                                spriteData);

                free(spriteData);
            }
        }else NSLog(@"ERROR: Image not loaded...");

        [img release];
        [imageData release];
    }
}

So, does anyone know how to deal with this? Is it because the iPhone only accepts the pngs altered by Xcode? What can we do in this case to make the png image display correctly?

+1  A: 

What are the .PNG's properties? (bit count / bits per component / etc.)

Alfons
Yes, I understand that, but on the iPhone, when you invoke [[UIImage alloc] initWithData: imageData], shouldn't it handle any image file type internally for you? That's what the Apple documentation says. Sorry, but this doesn't help.
unknownthreat
Nope. What helps: if the image format can be read but doesn't draw correctly, you have to convert it to, for example, RGBA32. See my answer here: http://stackoverflow.com/questions/2457116/iphone-changing-cgimagealphainfo-of-cgimage/2484912#2484912
Alfons
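For reference, a rough sketch of the kind of inspection and RGBA32 conversion Alfons is suggesting might look like this (img is the UIImage decoded from the received bytes, as in the code above; this is only an illustration, not the code from his linked answer):

// Sketch: check how the decoded image is actually encoded...
CGImageRef image = img.CGImage;
NSLog(@"bits/component: %zu, bits/pixel: %zu, alpha info: %d",
      CGImageGetBitsPerComponent(image),
      CGImageGetBitsPerPixel(image),
      (int)CGImageGetAlphaInfo(image));

// ...then force it into a plain RGBA32 buffer before uploading,
// regardless of how the original png was encoded.
size_t imgWidth  = CGImageGetWidth(image);
size_t imgHeight = CGImageGetHeight(image);
void *rgba = calloc(imgWidth * imgHeight, 4);   // 4 bytes per pixel, zero-filled

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(rgba, imgWidth, imgHeight, 8, imgWidth * 4,
                                         colorSpace, kCGImageAlphaPremultipliedLast);
CGContextDrawImage(ctx, CGRectMake(0, 0, imgWidth, imgHeight), image);
CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);

// The buffer now holds imgWidth * imgHeight RGBA pixels with a row stride of
// imgWidth * 4 bytes, so the whole image is uploaded with matching dimensions.
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, (GLsizei)imgWidth, (GLsizei)imgHeight,
                GL_RGBA, GL_UNSIGNED_BYTE, rgba);
free(rgba);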