I want to test how my application reacts to high-dpi settings. I don't just mean 120dpi; I want to test higher dpi settings, such as:

  • 150dpi
  • 300dpi
  • 600dpi
  • 1000dpi
  • 1200dpi

My development machine's video card cannot do the resolutions required for 300dpi (or even 150dpi, for that matter).

Assuming the interface is designed to 'fit' on a display with 768 lines (e.g. 1024x768), the resolutions required for the higher-dpi settings would be:

             Normal      Wide-Screen         Frame Buffer
  dpi     Resolution    Resolution    Zoom    Size (MiB)
=======  ============  ============  ======  ============
     96   1024 x  768   1280 x  768    100%        3.75
    113   1200 x  900   1440 x  900    117%        4.94
    120   1280 x  960   1536 x  960    125%        5.63
    131   1400 x 1050   1680 x 1050    137%        6.73
    150   1600 x 1200   1920 x 1200    156%        8.79
    300   3200 x 2400   3840 x 2400    313%       35.16
    600   6400 x 4800   7680 x 4800    625%      140.63
  1,000  10667 x 8000  12800 x 8000  1,042%      390.63

(Frame buffer size assumes 32-bit color at the wide-screen resolution.)

The required resolutions get pretty high, even at 150dpi.
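
For reference, a small C++ sketch that reproduces the table from each standard display height. It assumes the 16:10 wide-screen ratio and 4 bytes per pixel used above; the 96dpi row is the one exception, as the table uses the 5:3 1280 x 768 mode there:

    // For each standard display height, derive the dpi at which a
    // 768-line layout fills the screen, the matching 4:3 and 16:10
    // resolutions, and the 32-bit frame-buffer size (wide-screen).
    #include <cmath>
    #include <cstdio>

    int main()
    {
        const int heights[] = { 768, 900, 960, 1050, 1200, 2400, 4800, 8000 };
        for (int h : heights) {
            long dpi   = std::lround(96.0 * h / 768);    // dpi for this height
            long zoom  = std::lround(100.0 * h / 768);   // zoom percentage
            long wNorm = std::lround(h * 4.0 / 3);       // 4:3 "normal" width
            long wWide = std::lround(h * 16.0 / 10);     // 16:10 wide width
            double mib = (double)wWide * h * 4 / (1024 * 1024);  // 4 bytes/px
            printf("%5ld dpi  %6ld x %5ld  %6ld x %5ld  %5ld%%  %8.2f MiB\n",
                   dpi, wNorm, h, wWide, h, zoom, mib);
        }
        return 0;
    }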

I was thinking of something along the lines of running the software on a Virtual PC, with the virtual machine running at 6400x4800, and then using VNC to connect to the virtual machine. It could then scale the content to fit my monitor. Although I'd lose the fidelity of a high-dpi display, I could at least look at it and interact with it (i.e. test it). But the S3 Trio 32/64 video card that Virtual PC emulates tops out at 1600x1200 (i.e. 150dpi).

I also wondered if there is some virtual video card driver out there that can act like a high-resolution video card, but display itself scaled down on my native desktop.

Any ideas?


+3  A: 

You need a video card and a monitor that support 1920 x 1200. Many users have these, and they're a joy to use if you're a developer. If you have 1600 x 1200 and don't want to spend the money on a new monitor, that's fine. Beyond that, unless you're working for Pixar, I don't see the need.

Robert Harvey
Not really my money to spend. And getting anyone here (where I work) to care about dpi settings is like pulling teeth - as is also reflected in the comments on the OP.
Ian Boyd
Microsoft says: "Testing at a 192 DPI display setting is optional, but it enables you to determine how 'future proof' your application is. For Windows Vista applications, we recommend that you resolve issues you find at least for configurations up to 144 DPI."
Robert Harvey
The highest I can run is 131dpi.
Ian Boyd
+1  A: 

As you're already aware, both NVidia and ATI display cards allow you to create custom resolutions - but never in a million years up to 12800 x 8000. Just to give you an idea of how much memory that would take: it would require about 45 times as much memory as a 1920x1200 display. What you could do, however, is get a big honking rig and chain numerous cards together... and even then, 12800 x 8000 would be something better suited to custom hardware and drivers under Linux.

Nissan Fan
I was hoping for something more virtual :)
Ian Boyd
1997: I've heard that some SGI workstations have monitors that go up to 1600x1200!!!! zomgwtf I'll **never** have to worry about supporting those!
Ian Boyd
Are you still using apps written in 1997 with no updates?
Byron Whitlock
Ian, I think it's very likely that we'll be seeing greater resolutions in the near future, given the need for clarity in different lighting conditions and the desire to achieve paper-like quality from digital displays... however, 12800 x 8000 at 32-bit color is ~14.5 GB per second of data at a 75Hz refresh rate. That is just how much data needs to be piped out. Even just storing the raw screen contents takes a buffer of about 200MB.
Nissan Fan
I was just browsing NewEgg for monitors. Monitors increase in resolution, but they also increase in size. Rather than becoming higher-density, they are just getting larger.
Ian Boyd
@Nissan Fan - 12,800 x 8,000 x 32 bits per pixel = 390 MB, not 200 MB. It looks like you did 2 bytes instead of 4 bytes per pixel. The data rate would likely be at 60 Hz (LCDs usually max out at 60). So 390 MB * 60 Hz = 22.9 GB/sec.
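
For reference, a tiny sketch that reproduces that arithmetic:

    // One 12800 x 8000 frame at 4 bytes per pixel, and the bandwidth
    // needed to refresh it at 60 Hz.
    #include <cstdio>

    int main()
    {
        double frame = 12800.0 * 8000.0 * 4;                       // 32-bit color
        printf("frame:     %.1f MiB\n", frame / (1 << 20));        // ~390.6
        printf("bandwidth: %.1f GiB/s\n", frame * 60 / (1 << 30)); // ~22.9
        return 0;
    }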
Will Bickford
+2  A: 

If your app's layout behaves the same at 96, 120, 144, and 150 dpi, then I think there's no need to test even higher DPIs, since you will already have verified that it handles uneven dpi increments well.

Actually, there are many high-dpi-friendly setups on the market already, like 1680x1050 15.4" or 1920x1080 16" notebook displays, which already show pixel-dependency problems at 120dpi and are pretty uncomfortable to work with at 96dpi - so working on higher-density display support is valid. Good for you!

Edit: I've been thinking - it may not be very real-time, but what if you handled WM_PRINT or WM_PRINTCLIENT messages in your windows and printed them to a file, or at least showed a print preview of them using printer settings? Suddenly we're at 300dpi, at least. Just an idea.
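
A minimal Win32/C++ sketch of the idea, assuming a window whose paint code honors WM_PRINTCLIENT (the CaptureClient helper name is illustrative). This captures at the window's current size; for a true high-dpi rendering you would pass a printer DC, or a DC scaled to the target dpi, instead of the plain memory DC shown here:

    // Ask a window to render its client area into an off-screen bitmap.
    #include <windows.h>

    HBITMAP CaptureClient(HWND hwnd)
    {
        RECT rc;
        GetClientRect(hwnd, &rc);

        HDC screen  = GetDC(nullptr);
        HDC mem     = CreateCompatibleDC(screen);
        HBITMAP bmp = CreateCompatibleBitmap(screen, rc.right, rc.bottom);
        HGDIOBJ old = SelectObject(mem, bmp);

        // The window paints itself into our DC instead of the screen.
        SendMessage(hwnd, WM_PRINTCLIENT, (WPARAM)mem,
                    PRF_CLIENT | PRF_CHILDREN | PRF_ERASEBKGND);

        SelectObject(mem, old);
        DeleteDC(mem);
        ReleaseDC(nullptr, screen);
        return bmp;   // caller owns the bitmap and must DeleteObject() it
    }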

macbirdie
The *real* motivation for this is to make it blindingly obvious to colleagues a) *why* you should support high-dpi, and b) *where* you screwed it up. I run my PC at non-96 dpi, but I'm the only one. I need to make the fail **blindingly** obvious.
Ian Boyd
There's a problem with that notion: people are opposed to getting a small display with a high resolution because they're afraid the characters will be too small, so the demand for high-dpi is not as big as it should be. For now, notebook users, with their tiny displays and "disproportional" resolutions, are the primary target for high-dpi. Desktop monitors are still sized so they work well at 96dpi, so I'm afraid there's no good real-world argument for supporting high-dpi until people see a screwed-up app on their 240dpi 22" display.
macbirdie
A: 

Windows doesn't check whether your monitor actually measures out to the DPI you configure it for, so just attach the biggest monitor you can and start switching the setting.
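
As a sanity check, here's a small sketch (classic GDI, which is what applications of this era see) that reads back whatever dpi is configured - Windows reports the setting, not anything measured from the panel:

    // Read the configured system dpi via GDI.
    #include <windows.h>
    #include <cstdio>

    int main()
    {
        HDC screen = GetDC(nullptr);
        int dpiX = GetDeviceCaps(screen, LOGPIXELSX);
        int dpiY = GetDeviceCaps(screen, LOGPIXELSY);
        ReleaseDC(nullptr, screen);
        printf("Configured dpi: %d x %d\n", dpiX, dpiY);
        return 0;
    }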

I'm curious to know why you want to test such high resolutions, i.e. anything over 192. If you have an actual need for such high resolution, surely you have access to the hardware that will be running it?

Mark Ransom
The real motivation for using such high resolutions is to easily see if I missed anything - and to point out to colleagues that things were missed, and here's why.
Ian Boyd