Due to performance issues, I have had to transfer my Android OpenGL code from Java to C. I believe I transferred all of the OpenGL code, but I now have many errors in the section of my code that draws a bitmap as a texture to the screen.

When I run the project in an emulator, the current code does not display anything, and it appears to have a memory leak: it slowly consumes all available memory and forces other apps to close.

Here is the code that controls this part:

In the header file:

extern unsigned char colors[1024*512*3];   // 3 bytes per pixel

In the C file:

void appRender(int width, int height, unsigned char *colors)
{
    unsigned int textureID;
    float vertices[] = 
    {
        0.0f, 0.0f,
        512.0f, 0.0f,
        0.0f, 1024.0f,
        512.0f, 1024.0f
    };

    float texture[] =
    {
        0.0f, 0.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
        1.0f, 1.0f
    };

    unsigned char indices[] =
    {
        0, 1, 3,
        0, 3, 2
    };

    UpdateView();


    //onDraw 
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glOrthof(0.0f, 320.0f, 430.0f, 0.0f, 1.0f, -1.0f);

    //texture stuff
    glGenTextures(1,&textureID);
    glBindTexture(GL_TEXTURE_2D, textureID);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); 
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    //Different possible texture parameters, e.g
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 512, 1024, 0, GL_RGB, GL_UNSIGNED_BYTE, colors);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);

    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glTexCoordPointer(2, GL_FLOAT, 0, texture);

    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, indices);

    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
}

Any help would be greatly appreciated.

+1  A: 

There does not appear to be a call to glEnable(GL_TEXTURE_2D) to enable texturing in your example.

You can call glGetError() after your GL calls to find out whether any of them failed. This has helped me debug problems in the past.
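For example, here is a minimal sketch of an error-checking helper (the function name checkGLError and the log tag are my own invention, assuming you are logging through the NDK's <android/log.h>):

#include <GLES/gl.h>
#include <android/log.h>

// Drain and log every pending GL error; call this after a suspect
// block of GL calls to narrow down which one is failing.
static void checkGLError(const char *label)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
    {
        __android_log_print(ANDROID_LOG_ERROR, "GL", "%s: glGetError() = 0x%x", label, err);
    }
}

You would then call, say, checkGLError("after glTexImage2D") right after the texture upload.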

Also, you appear to be creating your texture inside your appRender() function. If appRender() is called for every frame you draw, that would explain your memory leak: each glGenTextures/glTexImage2D pair allocates a brand-new texture object, and you never delete any of them.

Typically you should generate and define the texture only once, during initialization.

So the following should be done once before rendering (with textureID kept somewhere persistent, e.g. a file-scope variable, so the drawing code can still bind it):

glGenTextures(1,&textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

//Different possible texture parameters, e.g
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 512, 1024, 0, GL_RGB, GL_UNSIGNED_BYTE, colors);
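
If the pixel data in colors changes from frame to frame (the question does not say, so this is an assumption), a common pattern is to keep that one texture and refresh its contents each frame with glTexSubImage2D, which rewrites the pixels in place without allocating a new texture object:

glBindTexture(GL_TEXTURE_2D, textureID);
// Overwrites the full 512x1024 region of the existing texture;
// no new storage is allocated, so this cannot leak.
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 512, 1024, GL_RGB, GL_UNSIGNED_BYTE, colors);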

Then, when drawing, you can do:

glEnable(GL_TEXTURE_2D);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glBindTexture(GL_TEXTURE_2D, textureID);

glVertexPointer(2, GL_FLOAT, 0, vertices);
glTexCoordPointer(2, GL_FLOAT, 0, texture);

glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, indices);

glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisable(GL_TEXTURE_2D);
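
One more note (not part of the original answer, just standard cleanup): when the texture is no longer needed, e.g. when your app shuts down, free it so the driver can reclaim the memory:

// Deletes the texture object created by glGenTextures during init.
glDeleteTextures(1, &textureID);
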
Aaron

I tried that, and it got rid of my memory leak, but I still can't get anything to draw. Nothing shows up on the screen whatsoever.

EnderX

Edit: I could see stuff every time I ran the initialization. Looks like I'm missing something from the first part in the second part?

EnderX