In my program the user can switch between full-screen and windowed modes. Full-screen mode may change the screen resolution, and switching back to windowed mode restores the resolution to what it was before.
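For reference, the switching logic is roughly equivalent to the sketch below (this assumes the Win32 `ChangeDisplaySettings` API; the function names are placeholders and the `DEVMODE` handling is simplified):

```c
#include <windows.h>

static DEVMODE g_savedMode; /* display mode captured before going full-screen */

/* Enter full-screen: remember the current mode, then apply the requested
   one as a temporary (CDS_FULLSCREEN) change. */
void enter_fullscreen(int width, int height)
{
    g_savedMode.dmSize = sizeof(g_savedMode);
    EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &g_savedMode);

    DEVMODE dm = {0};
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = width;
    dm.dmPelsHeight = height;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;
    ChangeDisplaySettings(&dm, CDS_FULLSCREEN);
}

/* Return to windowed mode: re-apply the mode that was saved on entry,
   which should match the desktop's default resolution. */
void leave_fullscreen(void)
{
    ChangeDisplaySettings(&g_savedMode, CDS_FULLSCREEN);
}
```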

At the program's termination, the operating system automatically restores the screen resolution to its default if it detects that the program changed it. However, even if the program switched to full-screen and then back to windowed mode (and therefore back to the default resolution), the operating system still attempts to restore the default resolution, producing an unwanted black flicker. Is this the result of improper resolution-switching code, or is there some way I can assure the OS that the resolution is already at the default so that it doesn't need to change it?