We have a game that is very close to release, but we have one extremely annoying problem: on one of our beta testers' phones, he can't hear any of the in-game sound effects. He can, however, hear the background music and the title screen music just fine.
The background and title music are both played via AVAudioPlayer (they are longer, and we need looping, volume control, etc.). The sound effects are simply played with AudioServicesPlaySystemSound (they are very short, and we don't need precise control or to know when they end). This works on most iPhones, but not on this one. Everything plays under an audio session with the AVAudioSessionCategorySoloAmbient category.
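For context, here is a minimal Swift sketch of the setup described above (the `AudioController` wrapper, file names, and extensions are illustrative, not our actual code):

```swift
import AVFoundation
import AudioToolbox

final class AudioController {
    private var musicPlayer: AVAudioPlayer?
    private var effectIDs: [String: SystemSoundID] = [:]

    func configureSession() {
        // SoloAmbient: silences other apps' audio and respects the ring/silent switch.
        try? AVAudioSession.sharedInstance().setCategory(.soloAmbient, mode: .default)
    }

    // Long, looping tracks go through AVAudioPlayer for looping and volume control.
    func playMusic(named name: String) {
        guard let url = Bundle.main.url(forResource: name, withExtension: "mp3") else { return }
        musicPlayer = try? AVAudioPlayer(contentsOf: url)
        musicPlayer?.numberOfLoops = -1   // loop indefinitely
        musicPlayer?.volume = 0.8
        musicPlayer?.prepareToPlay()
        musicPlayer?.play()
    }

    // Short one-shot effects go through System Sound Services (fire-and-forget).
    func playEffect(named name: String) {
        if effectIDs[name] == nil,
           let url = Bundle.main.url(forResource: name, withExtension: "caf") {
            var soundID: SystemSoundID = 0
            AudioServicesCreateSystemSoundID(url as CFURL, &soundID)
            effectIDs[name] = soundID
        }
        if let soundID = effectIDs[name] {
            AudioServicesPlaySystemSound(soundID)
        }
    }
}
```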
So I have two questions:

- First, is this an acceptable implementation? That is, is there something I missed that says you can't mix these two frameworks, or a reason why it's a bad idea to mix them?
- Second, has anyone seen something like this before? If so, did you find a way around it?
Additional background note: I can say pretty conclusively that on his phone the problem is the mixing of the two frameworks. He was able to hear the sound effects up until roughly the build in which we added the title screen music. Also, if I change one of the sounds to play through an AVAudioPlayer, he can hear it (a sketch of that diagnostic follows). Unfortunately, I can't simply move all the sound effects into AVAudioPlayers, because performance suffers badly and I need tighter synchronization.
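This is roughly what that diagnostic change looks like (again a sketch, with illustrative names):

```swift
import AVFoundation

// Diagnostic only: route one short effect through AVAudioPlayer instead of
// System Sound Services. On the affected phone the effect becomes audible this
// way, but the latency makes it unusable as a real fix for all effects.
var effectPlayer: AVAudioPlayer?

func playEffectViaAVAudioPlayer(named name: String) {
    guard let url = Bundle.main.url(forResource: name, withExtension: "caf") else { return }
    effectPlayer = try? AVAudioPlayer(contentsOf: url)
    effectPlayer?.prepareToPlay()
    effectPlayer?.play()
}
```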