I'm using stringWithFormat with @"%ls" to do it, and I only see the first character copied, which makes me think the string is still being treated as single-byte characters.
Any ideas?
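Roughly what the call looks like, where wstr is just a stand-in name for my actual wchar_t* buffer:

wchar_t *wstr = L"hello";
NSString *s = [NSString stringWithFormat:@"%ls", wstr];   // only the first character of wstr shows up in s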
Use initWithBytes:length:encoding:. You will have to know the encoding that wchar_t uses; I believe it is UTF-32 on Apple platforms.
#import <Foundation/Foundation.h>

// Pick the UTF-32 variant that matches this machine's byte order
#if TARGET_RT_BIG_ENDIAN
#define WCHAR_ENCODING NSUTF32BigEndianStringEncoding
#else
#define WCHAR_ENCODING NSUTF32LittleEndianStringEncoding
#endif

NSString *result = [[NSString alloc] initWithBytes:mystring
                                            length:(mylength * sizeof(wchar_t))
                                          encoding:WCHAR_ENCODING];
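Wrapped up as a small helper, it might look like the following sketch. The function name and the wcslen-based length are my own additions for illustration, and it relies on the WCHAR_ENCODING macro defined above:

#include <wchar.h>

static NSString *NSStringFromWideString(const wchar_t *ws) {
    // sizeof(wchar_t) is 4 on Apple platforms, matching the UTF-32 encodings above
    return [[NSString alloc] initWithBytes:ws
                                    length:(wcslen(ws) * sizeof(wchar_t))
                                  encoding:WCHAR_ENCODING];
}

NSString *s = NSStringFromWideString(L"wide text");
NSLog(@"%@", s);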
In general, I suggest avoiding wchar_t if at all possible, because it is not very portable. In particular, how are you supposed to figure out what encoding it uses?
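If you control where the text comes from, getting it as UTF-8 (or UTF-16) sidesteps the question entirely. A sketch, assuming a null-terminated char* that you know is UTF-8:

const char *utf8 = "some UTF-8 text";            // e.g. from a C API that documents its encoding
NSString *s = [NSString stringWithUTF8String:utf8];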