views: 164
answers: 2
I was wondering: is it possible to acquire write access to the graphics card's primary buffer through the Windows API, yet still allow read access to what should be there? To clarify, here is what I want:

  1. Create a DirectX device on a window and hide the window. Use the stencil buffer to apply an alpha channel to pixels not written to by my code.
  2. Acquire the entirety of the current display adapter's buffer, i.e. obtain a pointer to a buffer, in the current bit depth and resolution, that contains the current screen contents without whatever I drew. Instead of hiding my window, I was thinking of simply using a LAYERED window and somehow acquiring the buffer before my window's pixels get blitted to it.
  3. Copy the buffer acquired in step 2 into a new memory location.
  4. Blit my DirectX device's primary buffer onto the buffer built in step 3.
  5. Blit the buffer built in step 4 to the screen.
  6. GOTO 2.

So the end result is drawing hardware-accelerated 3D directly onto the Windows desktop while other applications continue to render normally.
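Steps 2 and 3 above (grab the current desktop contents, copy them into memory you own) can be sketched with plain GDI. This is only an illustrative sketch, not code from the question: `frameBufferSize` and `captureDesktop` are hypothetical names, and the capture path assumes Windows.

```cpp
#ifdef _WIN32
#include <windows.h>
#endif

// Bytes needed to hold one captured frame at the given resolution and
// bit depth. GDI pads each scanline to a 4-byte (DWORD) boundary.
long frameBufferSize(int width, int height, int bitsPerPixel) {
    long stride = ((static_cast<long>(width) * bitsPerPixel + 31) / 32) * 4;
    return stride * height;
}

#ifdef _WIN32
// Steps 2-3 of the plan: copy the whole desktop into an off-screen bitmap.
// The caller owns the returned HBITMAP and must DeleteObject() it.
HBITMAP captureDesktop() {
    HDC screen = GetDC(NULL);                        // DC for the entire screen
    int w = GetSystemMetrics(SM_CXSCREEN);
    int h = GetSystemMetrics(SM_CYSCREEN);
    HDC mem = CreateCompatibleDC(screen);            // off-screen DC
    HBITMAP bmp = CreateCompatibleBitmap(screen, w, h);
    HGDIOBJ old = SelectObject(mem, bmp);
    BitBlt(mem, 0, 0, w, h, screen, 0, 0, SRCCOPY);  // screen -> memory copy
    SelectObject(mem, old);
    DeleteDC(mem);
    ReleaseDC(NULL, screen);
    return bmp;
}
#endif
```

Note that `BitBlt` from the screen DC captures whatever is currently composited, so to get the desktop *without* your own pixels you would still need your window hidden (or layered and excluded) at the moment of capture, as the question anticipates.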

A: 

Is there any reason why you wouldn't just do this normally with GDI and use windowed mode for DirectX? Why bother with full-screen mode when you need to render to a window?
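A minimal sketch of what this answer suggests, assuming Direct3D 9 (the question does not name a DirectX version) and a hypothetical helper name. Setting `Windowed = TRUE` in the present parameters gives a hardware-accelerated device that coexists with the rest of the desktop, which sidesteps the buffer-capturing scheme entirely:

```cpp
// Windows-only sketch; requires d3d9.h and linking against d3d9.lib.
#ifdef _WIN32
#include <d3d9.h>

// Create a Direct3D 9 device in windowed mode for an existing window.
// Returns nullptr on failure.
IDirect3DDevice9* createWindowedDevice(IDirect3D9* d3d, HWND hwnd) {
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;               // the key difference: not full-screen
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;     // match the current display mode
    pp.hDeviceWindow    = hwnd;

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}
#endif
```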

MSN
A: 

There are better ways to create a window without borders. You might try experimenting with the dwStyle parameter of CreateWindow, for example. It looks as if passing in WS_OVERLAPPED | WS_POPUP results in a borderless window, which is what you appear to want (see this forum post).
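The suggestion above can be sketched as follows. This is an assumption-laden illustration, not code from the answer: the class name is made up, and since WS_OVERLAPPED is defined as 0, WS_POPUP by itself produces the same style:

```cpp
// Windows-only sketch of a borderless top-level window via WS_POPUP.
#ifdef _WIN32
#include <windows.h>

// Register a throwaway window class and create a borderless window.
// Returns NULL on failure.
HWND createBorderlessWindow(HINSTANCE inst, int width, int height) {
    WNDCLASS wc = {};
    wc.lpfnWndProc   = DefWindowProc;
    wc.hInstance     = inst;
    wc.lpszClassName = TEXT("BorderlessDemo");  // illustrative class name
    RegisterClass(&wc);

    // WS_POPUP suppresses the caption, borders, and system menu entirely.
    return CreateWindow(TEXT("BorderlessDemo"), TEXT(""),
                        WS_POPUP | WS_VISIBLE,
                        0, 0, width, height,
                        NULL, NULL, inst, NULL);
}
#endif
```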

I also suspect "borderless window" is not the standard term, since Google searches including those words return hardly any results.

Ricket