I am wondering if there is an easy way to calculate the text extent of a string (similar to GetTextExtentPoint32), but one that allows me to specify the DPI to use in the calculation. In other words, is there a function that does exactly what GetTextExtentPoint32 does but takes the DPI as a parameter, or a way to "trick" GetTextExtentPoint32 into using a DPI that I specify?
Before you ask "Why on earth do you want to do that?", I'll try to explain, but bear with me, the reasons behind this request are somewhat involved.
Ultimately, this is for a custom word-wrap algorithm that breaks a long string into smaller blocks of text that need to fit neatly on a Crystal Report with complex text layout requirements (it mimics the paper form used by police officers to file criminal complaints, so the state is in charge of the layout, not us, and it has to match the paper form almost exactly).
It's impossible for Crystal Reports to lay this text out properly without help (the text has to fit inside a small box on one page, followed by "standard-sized" continuation pages if the text overflows the small block), so I wrote code to break the text into multiple "blocks" that are stored in the reporting database and then rendered individually by the report.
Given the required dimensions (in logical inches) and the font information, the code first fits the text to the required width by inserting line breaks, then breaks it into correctly-sized blocks based on the text height. The algorithm uses VB6's TextHeight and TextWidth functions to calculate extents, which return the same results that GetTextExtentPoint32 would (I checked).
This code works great when the display is set to 96 DPI, but it breaks at 120 DPI: some lines end up with more words in them than they would have had at 96 DPI.
For example, "The quick brown fox jumps over the lazy dog" might break as follows:
At 96 DPI
The quick brown fox jumps over
the lazy dog
At 120 DPI
The quick brown fox jumps over the
lazy dog
This text is then further broken up by Crystal Reports, since the first line no longer fits in the corresponding text field on the report, so the actual report output looks like this:
The quick brown fox jumps over
the
lazy dog
At first, I thought I could compensate for this by scaling the results of TextHeight and TextWidth down accordingly (in theory, a 120 DPI extent should be 25% larger than its 96 DPI counterpart), but apparently life isn't so simple: it seems numerous rounding errors creep in (and possibly other factors?), so that the text extent of any given string is never exactly 25% larger at 120 DPI than at 96 DPI. I didn't expect it to scale perfectly, but it's not even close at times (in one test, the width at 120 DPI was only about 18% bigger than at 96 DPI).
This doesn't happen to any text on the report that is handled directly by Crystal Reports: it seems to do a very good job of scaling everything so that the report is laid out exactly the same at 96 DPI and 120 DPI (and even 144 DPI). Even when printing the report, the text is printed exactly as it appears on the screen (i.e. it truly seems to be WYSIWYG).
Given all of this, since I know my code works at 96 DPI, I should be able to fix the problem by calculating all my text extents at 96 DPI, even if Windows is currently using a different DPI setting.
Put another way, I want the result of my FitTextToRect function to return the same output at any DPI setting, by forcing the text extents to be calculated using 96 DPI. This should all work out since I'm converting the extents back to inches and then comparing them against required width and height in inches. I think it just so happens that 96 DPI produces more accurate results than 120 DPI when converting back and forth between pixels and inches.
I've been poring over the Windows font and text functions, seeing if I could roll my own function to calculate text extents at a given DPI, looking at GetTextMetrics and related functions to see how easy or difficult this might be. If there is an easier way to accomplish this, I'd love to know before I start creating my own versions of existing Windows API functions!