#define __T(x)      L ## x

Found in code in one of the MFC source header files. It is mostly used for converting strings to ........ (I don't know what). If I am correct, it converts strings to LPCTSTR... I don't know what that type is either...

I can't seem to convert char* into LPCTSTR. In MFC file handling, the following code always returns an error when trying to open the file:

    char* filepath = "C:\\Program Files\\Microsoft Office\\Office12\\BITMAPS\\STYLES\\GLOBE.WMF";

    if( !file.Open((LPCTSTR)filepath , CFile::modeRead, &fexp) )
    {
        fexp.ReportError();
        return 1;
    }

But if I write it this way instead, it doesn't give an error:

    if( !file.Open( _T("C:\\Program Files\\Microsoft Office\\Office12\\BITMAPS\\STYLES\\GLOBE.WMF") , CFile::modeRead, &fexp) )
    {
        fexp.ReportError();
        return 1;
    }

I am looking to pass a variable as the first argument to the CFile::Open() method.
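Casting a `char*` to `LPCTSTR` only reinterprets the pointer; in a Unicode build `LPCTSTR` is `const wchar_t*`, so the narrow bytes are misread and `Open` fails. The string actually has to be converted. A minimal, portable sketch using only the standard library (not MFC-specific; the function name `widen` is mine):

```cpp
#include <cstdlib>
#include <string>

// Widen a narrow string instead of casting it. Assumes the input is
// representable in the current locale (plain ASCII always is).
std::wstring widen(const std::string& narrow)
{
    std::wstring wide(narrow.size(), L'\0');
    std::size_t converted = std::mbstowcs(&wide[0], narrow.c_str(), narrow.size());
    if (converted == static_cast<std::size_t>(-1))
        return std::wstring();  // invalid multibyte sequence in input
    wide.resize(converted);
    return wide;
}
```

In MFC itself, constructing a CString from the `char*` (e.g. `file.Open(CString(filepath), CFile::modeRead, &fexp)`) should perform this conversion for you, since CString has converting constructors from narrow strings in Unicode builds.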

+1  A: 

The macro is simply token-pasting L onto the argument (## is the preprocessor's concatenation operator, not stringizing), so that:

_T("xyz")

becomes:

L"xyz"

This is the way to make a wide string literal but, in the non-Unicode builds, _T maps to nothing, so you'll get regular narrow strings there.

paxdiablo
+1  A: 

_T() allows you to set up your string literals so that you can build as either Unicode or non-Unicode.

In non-Unicode builds it evaluates to nothing, so a string literal is represented as "XYZ", a normal narrow string. In a Unicode build it evaluates to the L prefix (L"XYZ"), which tells the compiler that the string literal is a wide-character string. This, together with the various "T" string typedefs (LPCTSTR etc.), allows you to write code that builds correctly for both Unicode and non-Unicode builds.
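A simplified, self-contained re-creation of the tchar.h mechanism shows the idea (assumption: the real header also defines TCHAR, _TCHAR, and many function-name mappings; MY_T stands in for _T so it compiles without Windows headers):

```cpp
#include <cstring>
#include <cwchar>

#ifdef UNICODE
typedef wchar_t TCHAR;
#define MY_T(x) L##x   // paste the wide prefix onto the literal
#else
typedef char TCHAR;
#define MY_T(x) x      // leave the literal narrow
#endif

// The same source line compiles in either configuration:
const TCHAR* greeting = MY_T("XYZ");
```

Whether `greeting` points at `char` or `wchar_t` data is decided entirely at compile time by the UNICODE macro; the source text never changes.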

Note that google is your friend, simply typing _T into google gives several useful results...

Len Holgate
+2  A: 

Hi,

The ## operator is a preprocessor concatenation operator. That is, this is valid code:

#include <memory>  // std::auto_ptr (deprecated since C++11; prefer std::unique_ptr in new code)

#define DECLARE_PTR(X) typedef std::auto_ptr<X> X##Ptr
DECLARE_PTR(int); // gets expanded to: typedef std::auto_ptr<int> intPtr
intPtr i(new int(1));

In your case, the _T macro prepends the wide-literal prefix (L) to its argument. This only works with string literals. That means you can't write

char* str = "ABC";
wchar_t* wstr = _T(str); // error: expands to Lstr, which is an undefined identifier

but you can safely write

char* str = "ABC";
LPTSTR wstr = _T("ABC"); // OK, gets expanded to wchar_t * wstr = L"ABC";
                         // when UNICODE is defined
                         // and char * wstr = "ABC"; when unicode is not defined

The L prefix turns char and char* literals into wide-character literals (from a byte-wide representation to a sizeof(wchar_t)-wide representation). It has nothing to do with the long integer type.
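The effect of the prefix is visible in plain standard C++, no Windows headers required; these checks all hold at compile time:

```cpp
// The L prefix changes the literal's element type, not its length.
static_assert(sizeof(L'A') == sizeof(wchar_t), "wide char literal is wchar_t-sized");
static_assert(sizeof("AB") == 3 * sizeof(char), "narrow literal: 3 chars incl. the NUL");
static_assert(sizeof(L"AB") == 3 * sizeof(wchar_t), "wide literal: 3 wchar_t incl. the NUL");
```

On Windows `sizeof(wchar_t)` is 2; on most Unix-like systems it is 4, so the wide literal's total size differs between platforms even though its element count is the same.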

utnapistim
Still want to know how to convert an ANSI string to Unicode programmatically.
deostroll
Then perhaps you should mention that in your question...
Len Holgate
Have a look at `MultiByteToWideChar` and `WideCharToMultiByte`. If you're using the WinAPI in your code, these two should suffice. Otherwise: to convert ASCII chars from `char*` to `wchar_t*`, just cast each char (pseudocode): for each chr in the source `char*`, dest += `wchar_t(chr);`. To convert from `wchar_t*` to `char*`, do the same, as long as the values are only in the ANSI subset (convert each `wchar_t` to `char` and append it to the destination `char*`). If you want generic character conversion you need to handle different code pages (i.e. I don't know :( ).
utnapistim
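The per-character conversion described in the comment above can be sketched in plain C++ (assumption: the input is pure 7-bit ASCII, where char and wchar_t code points coincide; anything beyond that needs MultiByteToWideChar or proper code-page handling):

```cpp
#include <string>

// Naive ASCII-only widening: each byte becomes one wide character.
std::wstring ascii_to_wide(const std::string& src)
{
    std::wstring dest;
    dest.reserve(src.size());
    for (std::string::size_type i = 0; i < src.size(); ++i)
        dest += static_cast<wchar_t>(static_cast<unsigned char>(src[i]));
    return dest;
}

// The reverse, valid only while every code point fits in a char.
std::string wide_to_ascii(const std::wstring& src)
{
    std::string dest;
    dest.reserve(src.size());
    for (std::wstring::size_type i = 0; i < src.size(); ++i)
        dest += static_cast<char>(src[i]);
    return dest;
}
```

This round-trips losslessly only for the ASCII subset, which is exactly the caveat the comment makes.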