I am racking my brains trying to understand the discrepancy between the font size a user selects or specifies (for example, using a FontDialog) and the em-size reported by the Font class in .NET.
For example:
using (FontDialog dlg = new FontDialog()) {
    if (dlg.ShowDialog() == DialogResult.OK) {
        Console.WriteLine("Selected font size: " + dlg.Font.SizeInPoints.ToString("0.##"));
    }
}
Using the above code, you will get some confusing results:
Selecting 11 in the dialog produces 11.25
Selecting 12 in the dialog produces 12
Selecting 14 in the dialog produces 14.25
Selecting 16 in the dialog produces 15.75
This behaviour occurs regardless of the font you choose. As you can see from the results above, there is no obvious pattern to the discrepancy; it seems to vary randomly between +0.25 and -0.25 points.
I get around this in user interfaces by only ever displaying the font size rounded to a whole number, but I am sure I have seen word processing/DTP packages that allow users to select fractional font sizes, and those packages do not show this behaviour when interacting with the Windows font dialog.
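For reference, a minimal sketch of that display-side workaround (the helper name and format string here are just illustrative, not from any particular API):

// Round the reported em-size to the nearest whole point before showing it.
static string FormatFontSizeForDisplay(Font font)
{
    // MidpointRounding.AwayFromZero so that, e.g., 11.5 displays as 12.
    double rounded = Math.Round(font.SizeInPoints, MidpointRounding.AwayFromZero);
    return rounded.ToString("0");
}

Obviously this throws away legitimate fractional sizes such as 10.5, which is part of why I am asking.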
Can anyone provide a rational explanation for this? Is there a best-practice technique for displaying the font size in a UI? And what about when the user wants a fractional size such as '10.5'?