I still see advice about using the LPTSTR/TCHAR types, etc., instead of LPWSTR/WCHAR. I believe Unicode support was well established by Windows 2000, and I frankly don't write code for Windows 98 anymore (special cases excepted, of course). Given that I don't care about Windows 98 (or, even less, ME), as they are decade-old OSes, is there any reason to use the compatibility TCHAR types, etc.? Why still advise people to use TCHAR? What benefit does it add over using WCHAR directly?
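For concreteness, here is a minimal sketch of the two styles the question contrasts, assuming nothing beyond <windows.h> and <tchar.h> (the message strings are just placeholders):

    // Sketch: TCHAR and _T() expand differently depending on whether
    // _UNICODE is defined; WCHAR is always the 16-bit wide character.
    #include <windows.h>
    #include <tchar.h>

    int main()
    {
        // With _UNICODE: TCHAR is wchar_t, _T("...") is L"...",
        // and MessageBox resolves to MessageBoxW.
        // Without it: TCHAR is char and the *A variant is used.
        TCHAR portable[] = _T("compiles either way");

        // Using WCHAR directly pins the code to the wide API:
        WCHAR wide[] = L"always UTF-16";

        MessageBox(NULL, portable, _T("TCHAR build"), MB_OK);
        MessageBoxW(NULL, wide, L"WCHAR, explicitly", MB_OK);
        return 0;
    }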
If someone tells you to walk up to 1,000,000 lines of C++ built without _UNICODE, with plenty of declarations using char instead of wchar_t, TCHAR, or WCHAR, you had better be prepared to cope with the non-Unicode Win32 API. Conversion on that scale is quite costly, and may not be something the source of the money is prepared to pay for.
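As a hedged illustration of the legacy situation described above (ShowWindowTitle is a hypothetical helper, not a real API), every such char buffer and *A call site is something the conversion would have to touch:

    // A codebase built without _UNICODE, using char buffers
    // against the explicitly ANSI *A entry points.
    #include <windows.h>
    #include <stdio.h>

    void ShowWindowTitle(HWND hwnd)
    {
        char title[256];                      // char, not wchar_t/TCHAR/WCHAR
        GetWindowTextA(hwnd, title, 256);     // the non-Unicode variant
        printf("title: %s\n", title);
    }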
As for new code, well, there's so much example code out there using TCHAR that it may be easier to cut and paste, and in some cases there is friction between WCHAR as wchar_t and WCHAR as unsigned short.
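A rough sketch of that friction, assuming a hypothetical third-party function LegacyApi that predates MSVC's native wchar_t:

    #include <windows.h>

    // Hypothetical older API that declares its UTF-16 strings
    // as unsigned short* rather than wchar_t*.
    void LegacyApi(const unsigned short* s) { /* placeholder */ }

    void Demo()
    {
        WCHAR buf[16] = L"hi";   // WCHAR is wchar_t in winnt.h
        // Under /Zc:wchar_t (native wchar_t, the modern MSVC default)
        // this call needs a cast; under /Zc:wchar_t-, where wchar_t is
        // a typedef for unsigned short, it would compile as-is. Mixing
        // libraries built with different settings causes similar pain.
        LegacyApi(reinterpret_cast<const unsigned short*>(buf));
    }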
Who knows, maybe some day MS will add a UTF-32 data type under TCHAR?
Actually, the Unicode versions of the functions were introduced with Win32 in 1993, with Windows NT 3.1. In fact, on the NT-based OSes, almost all the *A functions just convert to Unicode and call the *W version internally. Also, support for the *W functions on 9x exists through the Microsoft Layer for Unicode.
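A rough sketch of that forwarding pattern (MySetWindowTextA is an illustrative stand-in, not the real system implementation, and error handling is omitted):

    // What an *A entry point conceptually does on the NT line:
    // convert the ANSI argument to UTF-16, delegate to the *W version.
    #include <windows.h>

    BOOL MySetWindowTextA(HWND hwnd, LPCSTR text)
    {
        WCHAR wide[512];
        MultiByteToWideChar(CP_ACP, 0, text, -1, wide, 512);
        return SetWindowTextW(hwnd, wide);   // the real work happens here
    }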
For new programs, I would definitely recommend using the TCHAR macros or WCHAR directly. I doubt MS will add support for any other character size during NT's lifetime. For existing code bases, it depends on how important Unicode support is versus the cost of fixing it. The *A functions need to stay in Win32 forever for backward compatibility.