
I am porting my code from Java to C, and I've run into a problem: I can't find a C function that takes an array of ints and creates a bitmap from it for OpenGL. In Java, I used

bitmap = Bitmap.createBitmap({int array name}, w, h, Config.RGB_565);

Is there a similar function that I can use in C, or a workaround that I could use?

Also, if it matters, I am programming for Android.

Thanks!

+1  A: 

I fished through the documentation for the BMP format a while ago to compose this Python implementation. You should be able to port it to C with little trouble.
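The linked Python version isn't reproduced here, but for reference, a minimal C sketch of an uncompressed 24-bit BMP writer might look like the following (write_bmp and put_u32 are made-up names; the input is assumed to be top-down RGB, 3 bytes per pixel):

#include <stdio.h>
#include <stdint.h>

/* Store a 32-bit value little-endian, as the BMP header requires. */
static void put_u32(uint8_t *p, uint32_t v)
{
    p[0] = (uint8_t)v;
    p[1] = (uint8_t)(v >> 8);
    p[2] = (uint8_t)(v >> 16);
    p[3] = (uint8_t)(v >> 24);
}

static int write_bmp(const char *path, const uint8_t *rgb, int w, int h)
{
    int stride = (3 * w + 3) & ~3;          /* BMP rows are padded to 4 bytes */
    uint32_t image_size = (uint32_t)(stride * h);
    uint8_t header[54] = { 'B', 'M' };      /* 14-byte file + 40-byte info header */

    put_u32(header + 2, 54 + image_size);   /* total file size             */
    put_u32(header + 10, 54);               /* offset to pixel data        */
    put_u32(header + 14, 40);               /* BITMAPINFOHEADER size       */
    put_u32(header + 18, (uint32_t)w);
    put_u32(header + 22, (uint32_t)h);      /* positive height = bottom-up */
    header[26] = 1;                         /* color planes                */
    header[28] = 24;                        /* bits per pixel              */
    put_u32(header + 34, image_size);

    FILE *f = fopen(path, "wb");
    if (!f) return -1;
    fwrite(header, 1, sizeof header, f);

    static const uint8_t pad[3] = { 0, 0, 0 };
    for (int y = h - 1; y >= 0; --y) {      /* BMP stores rows bottom-up   */
        for (int x = 0; x < w; ++x) {
            const uint8_t *px = rgb + 3 * (y * w + x);
            uint8_t bgr[3] = { px[2], px[1], px[0] };   /* BMP wants BGR   */
            fwrite(bgr, 1, 3, f);
        }
        fwrite(pad, 1, (size_t)(stride - 3 * w), f);
    }
    return fclose(f);                       /* 0 on success                */
}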

Paul McGuire
+3  A: 

I am not specifically familiar with Android, but if I remember my OpenGL C bindings properly, there is no Bitmap structure. Instead, the various texture functions just take GLuint names: plain integers that glGenTextures() hands out through a GLuint pointer (the C equivalent of an array), and that calls like glBindTexture() then accept.

In Java everything is an object, so the bitmap has a class associated with it, but that class is mostly just a wrapper (plus some helper methods) around the int array that actually contains your image.

In OpenGL specifically, you typically bind a texture name with glBindTexture(), and the other texture functions then operate on whatever is bound, referencing the correct OpenGL 'state'.

On the other hand, if you are trying to short-circuit this process, you can always use glTexImage2D() to hand OpenGL your pixel data directly.
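For illustration, a minimal sketch of what that looks like in C on Android (the array size and names are made up):

#include <GLES/gl.h>                   /* Android's OpenGL ES 1.x C header */

static unsigned char pixels[64*64*4];  /* raw RGBA bytes; fill from your int array */

void upload_texture(void)
{
    GLuint tex;                        /* a plain integer name, not a Bitmap object  */
    glGenTextures(1, &tex);            /* takes a GLuint pointer and fills in a name */
    glBindTexture(GL_TEXTURE_2D, tex); /* subsequent texture calls now affect 'tex'  */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 64, 64, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}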

tzenes
GL_UNSIGNED_SHORT_5_6_5 is the GL pixel type that matches RGB_565 data.
slf
http://markmail.org/message/f5uejhuvd7yylej2
slf
A: 

The function you're looking for is simply glTexImage2D(). It uploads data from an array into OpenGL. You specify the source format you have and the destination format you want, and it does the rest.

Use it together with glBindTexture() to create a stored, re-usable texture object, so you don't need to re-upload the texture data every frame before using it.
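Since the question's data is Config.RGB_565, a sketch of the matching upload (per slf's comment above, GL_UNSIGNED_SHORT_5_6_5 is the corresponding pixel type; the array name and size are made up):

#include <GLES/gl.h>

static unsigned short pixels565[128*128];   /* one packed 5-6-5 short per pixel */

void upload_565(void)
{
    /* GL_RGB + GL_UNSIGNED_SHORT_5_6_5 matches Config.RGB_565's packing */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 128, 128, 0,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels565);
}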

unwind
A: 

Here's what you need:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, x_size, y_size, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid *)your_data);

x_size and y_size must be powers of 2 (OpenGL ES generally requires power-of-two texture sizes unless an extension says otherwise). For this code you need an array of x_size*y_size 4-byte pixels (i.e. x_size*y_size*4 bytes). For testing, just try:

#include <stdlib.h>   /* for rand() */

unsigned char data[32*32*4];

for (int i = 0; i < (int)sizeof(data); ++i)
    data[i] = (unsigned char)rand();

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 32, 32, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid *)data);

Other possible formats: GL_BGRA (for a different byte order; on OpenGL ES this needs an extension), GL_RGB (if you don't have alpha). Use glGenTextures / glBindTexture to remember the texture setup for later use, as sketched below.
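A sketch of that setup-once, bind-later pattern, continuing from the snippet above (function names are illustrative):

static GLuint tex;

void setup_texture(void)                /* run once, at load time */
{
    glGenTextures(1, &tex);             /* get a fresh texture name */
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);  /* the default min filter expects mipmaps */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 32, 32, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid *)data);   /* 'data' from above */
}

void draw_frame(void)                   /* every frame: just rebind, no re-upload */
{
    glBindTexture(GL_TEXTURE_2D, tex);
    /* ... draw textured geometry ... */
}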

noop
Is there any way I could do this with integers instead of bytes?
EnderX
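(For what it's worth: glTexImage2D()'s last parameter is an untyped pointer, so you can pass an int array directly; each 32-bit int then packs one RGBA pixel. The byte order within the int depends on the CPU's endianness. A sketch, assuming little-endian ARM:)

unsigned int pixels[32*32];                 /* one packed pixel per int */

for (int i = 0; i < 32*32; ++i)
    pixels[i] = 0xFF0000FFu;                /* bytes in memory: FF 00 00 FF = opaque red RGBA */

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 32, 32, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid *)pixels);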