Hi,
I have built an iPhone application that captures voice audio and streams it to a server through an NSOutputStream instance. When the audio stream ends, the iPhone sends the text message "---end---" to the server and starts listening on its NSInputStream.
When the server (built in C#) receives the "---end---" message, it does some processing and writes the resulting text back to the client. Once the client receives the end message, it closes the NSOutputStream and NSInputStream connections.
Every time, before sending the "---end---" message, I put the sending thread to sleep for 0.5 seconds; that way I can guarantee the end message is not mixed with other data. This system works well in the simulator and on my own network (100 Mbps).
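For reference, the part of the client that sends the end marker and then waits for the server's reply looks roughly like this (a simplified Swift sketch with illustrative names, not my exact code; the stream setup and the audio-writing loop are omitted):

```swift
import Foundation

// Simplified sketch (illustrative names); outputStream / inputStream are the
// NSOutputStream / NSInputStream (OutputStream / InputStream in Swift) that
// are already open and connected to the server.
func finishStreamingAndWaitForReply(outputStream: OutputStream, inputStream: InputStream) {
    // Wait 0.5 s so the marker is hopefully not written together with
    // the last chunk of voice data.
    Thread.sleep(forTimeInterval: 0.5)

    // Send the end-of-stream marker as plain text.
    let marker = Array("---end---".utf8)
    marker.withUnsafeBufferPointer { buffer in
        _ = outputStream.write(buffer.baseAddress!, maxLength: buffer.count)
    }

    // Block until the server writes its text back; this assumes the whole
    // reply (text plus end message) arrives in a single read.
    var replyBuffer = [UInt8](repeating: 0, count: 4096)
    let bytesRead = inputStream.read(&replyBuffer, maxLength: replyBuffer.count)
    if bytesRead > 0 {
        let reply = String(bytes: replyBuffer[0..<bytesRead], encoding: .utf8) ?? ""
        print("server replied: \(reply)")
    }

    // Close both streams once the reply has been received.
    outputStream.close()
    inputStream.close()
}
```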
However, when I run the whole application over a slow network (1.2 Mbps), the communication breaks down. Sometimes the end message gets mixed in with the voice data, and likewise the end message arrives concatenated with the text sent from the server. When I increase the thread sleep time to 3 seconds on both the server and the client, the error occurrence rate is reduced.
I know this is probably a design issue in my client-server communication protocol, but I cannot figure out how exactly to fix it.
Can anybody explain how I can overcome these issues?