I'm enumerating Windows fonts like this:
LOGFONTW lf = {0};
lf.lfCharSet = DEFAULT_CHARSET;  // enumerate all character sets
lf.lfFaceName[0] = L'\0';        // enumerate all face names
lf.lfPitchAndFamily = 0;         // must be zero
::EnumFontFamiliesExW(hdc, &lf,
                      reinterpret_cast<FONTENUMPROCW>(FontEnumCallback),
                      reinterpret_cast<LPARAM>(this), 0);
My callback function has this signature:
// Declared static in FontEnumerator so it can be passed as a FONTENUMPROCW:
int CALLBACK FontEnumerator::FontEnumCallback(const ENUMLOGFONTEXW *pelf,
                                              const NEWTEXTMETRICEXW *pMetrics,
                                              DWORD font_type,
                                              LPARAM context);
For TrueType fonts, I typically get each face name multiple times.  For example, across several calls, pelf->elfFullName and pelf->elfLogFont.lfFaceName are both set to "Arial".  Looking more closely at the other fields, I see that each call is for a different script.  On the first call, pelf->elfScript is "Western" and pelf->elfLogFont.lfCharSet is the numeric equivalent of ANSI_CHARSET.  On the second call, I get "Hebrew" and HEBREW_CHARSET.  Third call, "Arabic" and ARABIC_CHARSET.  And so on.  So far, so good.
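To make the pattern concrete, here is a stripped-down version of my callback with logging added (the wprintf call, from <cstdio>, is purely illustrative):

int CALLBACK FontEnumerator::FontEnumCallback(const ENUMLOGFONTEXW *pelf,
                                              const NEWTEXTMETRICEXW *pMetrics,
                                              DWORD font_type,
                                              LPARAM context)
{
    // For "Arial" this fires once per script variant:
    //   face=Arial script=Western charset=0   (ANSI_CHARSET)
    //   face=Arial script=Hebrew  charset=177 (HEBREW_CHARSET)
    //   face=Arial script=Arabic  charset=178 (ARABIC_CHARSET)
    wprintf(L"face=%ls script=%ls charset=%u\n",
            pelf->elfLogFont.lfFaceName,
            pelf->elfScript,
            static_cast<unsigned>(pelf->elfLogFont.lfCharSet));
    return 1;  // non-zero continues the enumeration
}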
But the font signature field (pMetrics->ntmFontSig) is identical for all of these versions of Arial.  In fact, the signature claims that every one of them supports Latin-1, Hebrew, Arabic, and others.
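For reference, this is how I read the signature inside the callback; the bit positions come from the FONTSIGNATURE documentation for the fsCsb code-page bitfields:

// Inside the callback: every Arial variant reports the same bits.
const FONTSIGNATURE &sig = pMetrics->ntmFontSig;
bool latin1 = (sig.fsCsb[0] & (1u << 0)) != 0;  // cp1252, Latin 1
bool hebrew = (sig.fsCsb[0] & (1u << 5)) != 0;  // cp1255, Hebrew
bool arabic = (sig.fsCsb[0] & (1u << 6)) != 0;  // cp1256, Arabic
// All three come back true on every call, regardless of which script
// variant ("Western", "Hebrew", "Arabic") the call describes.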
I know the character sets of the strings I'm trying to draw, so I'm trying to instantiate an appropriate font based on the font signatures.  Because the signatures always match, I always end up selecting the "Western" variant, even when displaying Hebrew or Arabic text.  I'm using low-level Uniscribe APIs, so I don't get the benefit of Windows font fallback/font linking, and yet my code seems to work anyway.
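My selection test looks roughly like this (SupportsCharset is my own helper; TranslateCharsetInfo maps a charset to its signature bits, and the odd-looking cast is the documented idiom for TCI_SRCCHARSET, which passes the value through the pointer parameter):

bool SupportsCharset(const NEWTEXTMETRICEXW *pMetrics, BYTE charset)
{
    CHARSETINFO csi = {};
    // TCI_SRCCHARSET: the charset value itself is smuggled through
    // the DWORD* parameter rather than an actual address.
    if (!::TranslateCharsetInfo(
            reinterpret_cast<DWORD *>(static_cast<UINT_PTR>(charset)),
            &csi, TCI_SRCCHARSET))
        return false;
    // Compare the code-page bits the charset implies against the bits
    // the enumerated font claims to cover.
    return (pMetrics->ntmFontSig.fsCsb[0] & csi.fs.fsCsb[0]) != 0 ||
           (pMetrics->ntmFontSig.fsCsb[1] & csi.fs.fsCsb[1]) != 0;
}

Because every Arial variant passes this test for every charset I care about, the first ("Western") variant always wins.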
Does lfCharSet actually carry any meaning, or is it a legacy artifact?  Should I just set lfCharSet to DEFAULT_CHARSET and stop worrying about all the script variations of each face?
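Concretely, the choice I'm asking about is between these two lines when I build the font (a minimal CreateFontIndirectW sketch):

LOGFONTW lf = {0};
wcscpy_s(lf.lfFaceName, L"Arial");
lf.lfCharSet = HEBREW_CHARSET;    // pick the script-specific variant...
// lf.lfCharSet = DEFAULT_CHARSET; // ...or let GDI decide?
HFONT font = ::CreateFontIndirectW(&lf);
// ... select the font, draw ...
::DeleteObject(font);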
For my purposes, I only care about TrueType and OpenType fonts.
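In case it matters, this is the filter at the top of my callback (the NTM_* flags are from wingdi.h; checking them for OpenType outlines is my assumption):

if (!(font_type & TRUETYPE_FONTTYPE))
    return 1;  // skip raster and vector fonts, keep enumerating
// The NEWTEXTMETRIC data is only meaningful for TrueType enumeration;
// its ntmFlags bits distinguish the OpenType flavors.
const DWORD flags = pMetrics->ntmTm.ntmFlags;
const bool open_type = (flags & (NTM_PS_OPENTYPE | NTM_TT_OPENTYPE)) != 0;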