The following code writes no data to the back buffer on Intel integrated video cards, for example on a MacBook. On ATI cards, such as in the iMac, it draws to the back buffer. The width and height are correct (an 800x600 buffer), and m_PixelBuffer is correctly filled with 0xAA00AA00.

My best guess so far is that the problem has to do with not setting glWindowPos. I don't currently set it (or the raster position), and when I query GL_CURRENT_RASTER_POSITION, the default on the ATI card is (0,0,0,0), while on the Intel it's (0,0,0,1). When I set the raster position on the ATI card to (0,0,0,1), I get the same result as on the Intel card: nothing drawn to the back buffer. Is there some transform state I'm missing? This is a 2D application, so the view transform is a very simple glOrtho.

glDrawPixels(GetBufferWidth(), GetBufferHeight(), GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, m_PixelBuffer);
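
For reference, the surrounding per-frame code looks roughly like this (a simplified sketch, not the exact code; GetBufferWidth() and GetBufferHeight() are my own accessors):

/* Sketch: the actual projection setup may differ slightly */
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, GetBufferWidth(), 0.0, GetBufferHeight(), -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

/* m_PixelBuffer holds GetBufferWidth() * GetBufferHeight() 32-bit pixels,
   each set to 0xAA00AA00 */
glDrawPixels(GetBufferWidth(), GetBufferHeight(), GL_BGRA,
             GL_UNSIGNED_INT_8_8_8_8_REV, m_PixelBuffer);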

If there's any more info I can provide, please ask. I'm pretty much an OpenGL and Mac newbie, so I don't know if I'm providing enough information.

A: 

What does glGetError() report?

sparkes
+2  A: 

I've always had problems with OpenGL implementations from Intel, though I'm not sure that's your problem this time. I think you're running into some byte-order issues. Give this a read and feel free to experiment with different constants for packing and color order.

http://developer.apple.com/documentation/MacOSX/Conceptual/universal_binary/universal_binary_tips/chapter_5_section_25.html

I know it's an OS X guide, but you can probably find similar OpenGL articles on other sites for other platforms. It should still be applicable.
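
For example, here's a rough sketch of the kind of experiments I mean (w, h, and pixels stand in for your own width, height, and buffer; which pair is right depends on how the buffer was filled):

/* Each call interprets the same 32 bits per pixel differently;
   try them in turn and see which one shows the colors you expect. */
glDrawPixels(w, h, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels); /* 0xAARRGGBB words */
glDrawPixels(w, h, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8,     pixels); /* 0xBBGGRRAA words */
glDrawPixels(w, h, GL_RGBA, GL_UNSIGNED_BYTE,            pixels); /* R,G,B,A as consecutive bytes */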

basszero
A: 

What does glGetError() report?

GL_NO_ERROR :(

Chris Blackwell
+1  A: 

I've always had problems with OpenGL implementations from Intel

This is kind of what I'm worried about, but I have a hard time believing they'd screw up something as basic as glDrawPixels. Also, since I can "duplicate" the problem by changing the raster position vector, I suspect it's my fault and I'm missing something basic.

I think you're running into some byte-order issues

That was my first inclination, and I've tried packing differently, with no result. I also tried packing the buffer with values that would present a usable alpha if swizzled, also to no avail. That's why I'm barking up the raster position tree, but honestly I'm still not 100% sure. Note that I'm targeting only Intel Macs, if that makes a difference.

Thanks for the link, it was a good read, and good to tuck away for future reference. I'd upmod but I can't until I get 3 more rep points :)

Chris Blackwell
A: 

It's highly unlikely that a function as basic as glDrawPixels is broken. Have you tried some really simple settings, like GL_RGB or GL_RGBA for format and GL_UNSIGNED_BYTE or GL_FLOAT for type? If not, can you share the smallest possible program that replicates your problem?
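
For example, a minimal test along these lines (a sketch using GLUT; adjust the header and window setup for your platform) should show a solid magenta square in the window on any conformant implementation:

#include <GLUT/glut.h>  /* use <GL/glut.h> on non-Mac platforms */

static unsigned char pixels[64 * 64 * 4];  /* 64x64 RGBA test image */

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glRasterPos2i(0, 0);  /* bottom-left corner of the ortho volume */
    glDrawPixels(64, 64, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    int i;
    for (i = 0; i < (int)sizeof(pixels); i += 4) {
        pixels[i]     = 0xAA;  /* R */
        pixels[i + 1] = 0x00;  /* G */
        pixels[i + 2] = 0xAA;  /* B */
        pixels[i + 3] = 0xFF;  /* A: opaque */
    }
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(800, 600);
    glutCreateWindow("glDrawPixels test");
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, 800, 0, 600, -1, 1);  /* simple 2D ortho, as in the question */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}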

Ashwin
+1  A: 

The default raster position should be (0,0,0,1), but you can reset it to make sure.

Just before calling glDrawPixels(), try

GLboolean valid;
glGetBooleanv(GL_CURRENT_RASTER_POSITION_VALID, &valid);

This should tell you if the current raster position is valid. If it is, then this is not your problem.
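
If it comes back GL_FALSE, re-specify a position that lies inside your glOrtho volume before drawing, e.g. (assuming the 0..width, 0..height ortho from the question):

glRasterPos2i(0, 0);  /* bottom-left corner; valid as long as it falls inside the viewport */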

Adrian