views: 36
answers: 1
Hi,

I have written a voice streaming application for iPhone using AudioQueue. When audio recording starts, I open the network connection and pass the NSOutputStream instance to AudioInputCallback via the inUserData pointer.

void AudioInputCallback(
  void *inUserData, 
  AudioQueueRef inAQ, 
  AudioQueueBufferRef inBuffer, 
  const AudioTimeStamp *inStartTime, 
  UInt32 inNumberPacketDescriptions, 
  const AudioStreamPacketDescription *inPacketDescs) {
      static int sentCnt = 0;  // running count of buffers sent
      RecordState *recordState = (RecordState *)inUserData;
      if (!recordState->recording) {
          NSLog(@"Record ending...");
      }
      else {
          // Note: write:maxLength: may write fewer bytes than requested;
          // its return value should be checked on slow links.
          [recordState->soStream write:(const uint8_t *)inBuffer->mAudioData
                             maxLength:inBuffer->mAudioDataByteSize];
          NSLog(@"Count:%d Size:%u\n", sentCnt++, (unsigned)inBuffer->mAudioDataByteSize);
      }
      recordState->currentPacket += inNumberPacketDescriptions;
      // Hand the buffer back to the audio queue so it can be refilled.
      AudioQueueEnqueueBuffer(recordState->queue, inBuffer, 0, NULL);
  }

According to the AudioQueue initialization parameters, each inBuffer is 16000 bytes long. Over Wi-Fi the application works without any problem, but over a 3G network the client-server communication is not stable. Has anybody had the same experience, or can someone suggest a tip to solve this?
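One likely culprit on a congested 3G link is partial writes: `-[NSOutputStream write:maxLength:]` returns the number of bytes actually written, which can be less than the 16000-byte buffer, and the code above ignores that return value. A minimal C sketch of the retry loop needed, using a plain POSIX file descriptor in place of the NSOutputStream purely as an illustration:

```c
#include <errno.h>
#include <stddef.h>
#include <unistd.h>

/* Write exactly len bytes to fd, retrying on short writes.
   Returns 0 on success, -1 on a real error. */
int write_all(int fd, const unsigned char *buf, size_t len) {
    size_t sent = 0;
    while (sent < len) {
        ssize_t n = write(fd, buf + sent, len - sent);
        if (n < 0) {
            if (errno == EINTR)
                continue;   /* interrupted by a signal: retry */
            return -1;      /* real I/O error */
        }
        sent += (size_t)n;
    }
    return 0;
}
```

The same loop structure applies to the NSOutputStream call: keep advancing the buffer pointer by the returned byte count until the whole 16000 bytes are out.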

A: 

One way of fixing this is to push each audio buffer (16000 bytes) onto a queue and start another thread that dequeues the buffers one at a time and sends them to the server. But can anybody tell me how to synchronize one queue between two threads?
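For the synchronization question, the standard pattern is a fixed-size ring buffer guarded by a mutex and two condition variables: the AudioQueue callback pushes, and the sender thread pops, blocking while the queue is empty. A minimal pthreads sketch (the names `BufferQueue`, `bq_push`, `bq_pop`, and the slot count are mine, not from the original code):

```c
#include <pthread.h>
#include <string.h>

#define SLOT_COUNT 8
#define SLOT_SIZE  16000  /* matches the 16000-byte AudioQueue buffers */

typedef struct {
    unsigned char data[SLOT_COUNT][SLOT_SIZE];
    size_t        len[SLOT_COUNT];
    int           head, tail, count;
    pthread_mutex_t lock;
    pthread_cond_t  not_empty;
    pthread_cond_t  not_full;
} BufferQueue;

void bq_init(BufferQueue *q) {
    q->head = q->tail = q->count = 0;
    pthread_mutex_init(&q->lock, NULL);
    pthread_cond_init(&q->not_empty, NULL);
    pthread_cond_init(&q->not_full, NULL);
}

/* Producer side: called from the AudioQueue callback. */
void bq_push(BufferQueue *q, const void *buf, size_t len) {
    pthread_mutex_lock(&q->lock);
    while (q->count == SLOT_COUNT)            /* wait for a free slot */
        pthread_cond_wait(&q->not_full, &q->lock);
    memcpy(q->data[q->tail], buf, len);
    q->len[q->tail] = len;
    q->tail = (q->tail + 1) % SLOT_COUNT;
    q->count++;
    pthread_cond_signal(&q->not_empty);
    pthread_mutex_unlock(&q->lock);
}

/* Consumer side: the sender thread blocks here until data arrives. */
size_t bq_pop(BufferQueue *q, void *buf) {
    pthread_mutex_lock(&q->lock);
    while (q->count == 0)                     /* wait for data */
        pthread_cond_wait(&q->not_empty, &q->lock);
    size_t len = q->len[q->head];
    memcpy(buf, q->data[q->head], len);
    q->head = (q->head + 1) % SLOT_COUNT;
    q->count--;
    pthread_cond_signal(&q->not_full);
    pthread_mutex_unlock(&q->lock);
    return len;
}
```

The `while` (not `if`) around each `pthread_cond_wait` matters: waits can wake spuriously, so the condition must be rechecked. Copying into the ring also means the AudioQueue buffer can be re-enqueued immediately, so a slow 3G link never blocks the audio callback.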

chatcja