So this is a really strange problem. I have a Java app that acts as a server: it listens for and accepts incoming client connections, then reads data (XML) off of the socket. Using my Java client driver, everything works great, and I receive messages as expected. However, using my C++ client driver, on the first message only, the very first character read is an ASCII 0 (it shows up as a little box). We're using the standard socket API in C++, passing in a char* (we've tried char*, std::string, and plain text in quotes).
I used Wireshark to sniff the packet, and sure enough, the extra byte is there on the wire. Admittedly, I haven't captured on the client machine; my argument is that it shouldn't matter where I capture, but correct me if that assumption is wrong.
So my question: what the heck? Why does only the first message contain this extra prepended byte, while all subsequent messages are fine? Is there some little trick to making this work?