I have a Direct3D app which runs windowed or fullscreen at a fixed resolution (say 800x600). To support widescreen modes, I render to the back buffer at 800x600 and then use Blt to draw the final frame into a portion of the front buffer, which is usually bigger (say 1280x720), so the 800x600 image is stretched to 960x720 to maintain the aspect ratio.

This works fine, except on some video card/OS/driver combinations (nVidia Quadro, DirectX 11, Windows 7) where the blit appears to be done with point sampling, resulting in jagged edges and a generally rough final image.

Is there any way to avoid this? For example, can I force Blt to use a linear filter when scaling up?

(Note: I know I could render the original 800x600 assets directly at 960x720 instead of stretching at the end, but that has other drawbacks, so stretching at the end is the preferred solution.)

+1  A: 

I don't think there is any way to control this. I read something saying that this behaviour changed in Windows 7 with some drivers, but I can't find the reference now.

You could perhaps render to a texture at 800x600 and then draw a full-screen quad using this texture at the actual screen size. Then at least you could control the filtering.
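
A minimal sketch of that stretch pass, assuming a D3D9 device and an 800x600 render-target texture that already holds the finished scene (the names dev and sceneTex, and the destination rectangle, are placeholders, not anything from your existing code):

    #include <d3d9.h>

    // Pre-transformed vertex for a screen-space quad.
    struct QuadVertex { float x, y, z, rhw; float u, v; };
    const DWORD QUAD_FVF = D3DFVF_XYZRHW | D3DFVF_TEX1;

    // Draws sceneTex stretched over the destination rectangle with explicit
    // bilinear filtering, instead of relying on the driver's blit behaviour.
    void DrawStretchedScene(IDirect3DDevice9* dev, IDirect3DTexture9* sceneTex,
                            float dstX, float dstY, float dstW, float dstH)
    {
        // The -0.5f offset keeps texels aligned with pixels in D3D9.
        QuadVertex quad[4] = {
            { dstX        - 0.5f, dstY        - 0.5f, 0.0f, 1.0f, 0.0f, 0.0f },
            { dstX + dstW - 0.5f, dstY        - 0.5f, 0.0f, 1.0f, 1.0f, 0.0f },
            { dstX        - 0.5f, dstY + dstH - 0.5f, 0.0f, 1.0f, 0.0f, 1.0f },
            { dstX + dstW - 0.5f, dstY + dstH - 0.5f, 0.0f, 1.0f, 1.0f, 1.0f },
        };

        dev->SetTexture(0, sceneTex);
        // Request linear filtering explicitly for the upscale.
        dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        dev->SetFVF(QUAD_FVF);
        dev->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(QuadVertex));
    }

For the 800x600-to-960x720 case in the question this would be called inside the usual BeginScene/EndScene pair as DrawStretchedScene(dev, sceneTex, 160.0f, 0.0f, 960.0f, 720.0f) to centre the image in a 1280x720 back buffer.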

John Burton
A: 

Have you considered rendering the scene to a texture and then rendering that texture to the back buffer stretched? This will give you bilinear filtering.

The reason it works on some machines is that this is exactly how the driver implements the blit on those machines.
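
A minimal sketch of the render-to-texture side, assuming a D3D9 device (dev is a placeholder name, the 800x600 X8R8G8B8 target matches the sizes in the question, and error handling is omitted):

    #include <d3d9.h>

    // Create the 800x600 off-screen render target once at startup.
    IDirect3DTexture9* CreateSceneTarget(IDirect3DDevice9* dev)
    {
        IDirect3DTexture9* tex = NULL;
        dev->CreateTexture(800, 600, 1, D3DUSAGE_RENDERTARGET,
                           D3DFMT_X8R8G8B8, D3DPOOL_DEFAULT, &tex, NULL);
        return tex;
    }

    // Each frame: redirect rendering into the texture, draw the scene,
    // then restore the real back buffer before the stretched draw.
    void RenderSceneToTexture(IDirect3DDevice9* dev, IDirect3DTexture9* sceneTex)
    {
        IDirect3DSurface9* backBuffer = NULL;
        IDirect3DSurface9* target = NULL;

        dev->GetRenderTarget(0, &backBuffer);
        sceneTex->GetSurfaceLevel(0, &target);
        dev->SetRenderTarget(0, target);

        // ... existing 800x600 scene rendering goes here ...

        dev->SetRenderTarget(0, backBuffer);
        target->Release();
        backBuffer->Release();
    }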

To get the best results, though, you really are better off just rendering to a back buffer of the proper size, i.e. if you want 1280x720, render to a 1280x720 back buffer with the appropriate field-of-view and aspect-ratio modifications.

Goz
Yep, I switched to render-to-texture a couple of days ago and it's working fine. I render to an 800x600 target and then stretch the final quad, though; rendering directly to the final-sized buffer has other issues, as I noted in my original post.
ggambett