I am trying to send a string terminated with \0 to a TCP socket, but it seems the \0 never reaches its destination.

Using this code:

NSString *s = @"<b/> \0_";
uint8_t *buf = (uint8_t *)[s UTF8String];
int resCode = [outputStream write:buf maxLength:strlen((char *)buf)];

I only seem to send @"<b/> ". Probably the \0 is seen as the end of the C string.

Can someone tell me how it should be done?

A: 

Why do you want to send the '\0' character in your string?

Can't you use any other character? The '\0' character is always treated as a terminating character!

And if you really want to send it that way, then send a different character and map it to '\0' when you receive it (sketched below)!

SPatil
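
To make that workaround concrete, here is a minimal sketch; the placeholder byte 0x01 and the rxBuf/len names on the receiving side are assumptions for illustration, not part of the question's protocol:

// Sender: use a placeholder byte instead of \0 in the string
NSString *s = @"<b/> \x01_";
const uint8_t *buf = (const uint8_t *)[s UTF8String];
[outputStream write:buf maxLength:strlen((const char *)buf)];

// Receiver: after reading len bytes into rxBuf, map the placeholder back to \0
for (NSUInteger i = 0; i < len; i++) {
    if (rxBuf[i] == 0x01) rxBuf[i] = '\0';
}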
I can't, because the receiver has already been implemented that way. And it is being used with other applications.
dkk
It's perfectly fine and common to use \0 as a terminating character in string I/O.
Nikolai Ruhe
+3  A: 

Of course it is sent without the \0. strlen returns the length of the string, not counting the null terminator. Change it to something like

[outputStream write:buf maxLength:(strlen((char *)buf) + 1)]

sha
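
Put together, a minimal sketch of the corrected send, assuming outputStream is an already-opened NSOutputStream (the error check is only illustrative):

NSString *s = @"<b/> \0_";
const uint8_t *buf = (const uint8_t *)[s UTF8String];
// strlen() stops at the first NUL, so add 1 to include that byte in the write.
// Note: anything after the embedded \0 (the trailing "_" here) is not sent.
NSInteger written = [outputStream write:buf maxLength:strlen((const char *)buf) + 1];
if (written < 0) {
    NSLog(@"write failed: %@", [outputStream streamError]);
}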
You could add that the problem has nothing to do with the `NSString` class, but only with the understanding of C's string functions. Also, it's not necessary to terminate the NSString with the NUL character; that's done by `UTF8String`.
Nikolai Ruhe
The \0 character is not used to terminate an NSString, but to terminate a command at the server. @sha: Thanks a lot, it worked like a charm =)
dkk
@dkk I understand, but since `UTF8String` returns a null-terminated C string, you don't have to add that character to the `NSString` (see the sketch below).
Nikolai Ruhe
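
To illustrate that last point, a short sketch under the same assumptions: the \0 can be left out of the literal, because UTF8String appends a terminating NUL and strlen() + 1 includes it in the write.

NSString *command = @"<b/> ";
const uint8_t *buf = (const uint8_t *)[command UTF8String];
// UTF8String already NUL-terminates, so the literal needs no trailing \0
[outputStream write:buf maxLength:strlen((const char *)buf) + 1];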