tags: · views: 13 · answers: 1

I was using GetDIBits to get bitmap data from a screen compatible device context into a DIB of a certain format. I was under the impression that the DC was necessary only for synthesizing a color table when the source bitmap is 8 bits-per-pixel or less. Since my source bitmap was a full 32-bit color image and this was a one-off program and I didn't have the screen DC handy, I set the HDC parameter to NULL. This didn't work. Once I grabbed the screen DC and passed it in, it did start working.

That left me wondering why GetDIBits requires the device context. What is it used for?
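For concreteness, here is a minimal sketch of the working approach described above: grabbing the screen DC and passing it to GetDIBits rather than NULL. The bitmap handle, dimensions, and function name are illustrative, and error handling is omitted.

```c
/* Sketch: copy the pixels of a 32-bpp HBITMAP into a DIB buffer.
   Assumes hBmp, width, and height are supplied by the caller. */
#include <windows.h>
#include <stdlib.h>

void CopyBitsToDib(HBITMAP hBmp, int width, int height)
{
    BITMAPINFO bmi = {0};
    bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth       = width;
    bmi.bmiHeader.biHeight      = -height;  /* negative height = top-down DIB */
    bmi.bmiHeader.biPlanes      = 1;
    bmi.bmiHeader.biBitCount    = 32;
    bmi.bmiHeader.biCompression = BI_RGB;

    void *bits = malloc((size_t)width * height * 4);

    HDC hdcScreen = GetDC(NULL);            /* DC for the entire screen */
    /* As noted above, passing NULL for the HDC fails even though the
       source is full 32-bit color and no color table is needed. */
    GetDIBits(hdcScreen, hBmp, 0, (UINT)height, bits, &bmi, DIB_RGB_COLORS);
    ReleaseDC(NULL, hdcScreen);

    /* ... use bits ... */
    free(bits);
}
```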

A: 

Consider the signature (shown here for SetDIBits; GetDIBits takes the same set of parameters):

int SetDIBits(
  __in  HDC hdc,
  __in  HBITMAP hbmp,
  __in  UINT uStartScan,
  __in  UINT cScanLines,
  __in  const VOID *lpvBits,
  __in  const BITMAPINFO *lpbmi,
  __in  UINT fuColorUse
);

The second argument, hbmp, is the device-dependent bitmap that will be altered using the color information from the device-independent bitmap. The hdc is a handle to the device context on which this (device-dependent) bitmap depends. When the call is made, Windows uses information from this device context to decide how to perform the transformation.

Michael Goldshteyn
"Color information" is vague. That could be as simple as referencing the palette, which should be irrelevant for 24- or 32-bpp images. It could be for color matching, but MSDN mentions color matching only for `SetDIBits`, not `GetDIBits`.
Adrian McCarthy