I want to generate a sound wave programmatically and play it with AVAudioPlayer. I have the code to encode my waveform as linear PCM, 44100 Hz, mono, 8 bits per sample.

I am not clear on what kind of envelope I need to wrap around this buffer so that AVAudioPlayer recognizes it as PCM.

A: 

Maybe adding a WAV header would help?

zoul
Is this speaking from experience, or speculation?
iter
I did not try it, but it has to work.
zoul
Thank you for your clarification. I am thinking less along the lines of rolling my own WAV encoder and more along the lines of using AudioToolbox to do the lifting for me.
iter
There’s not much to a “WAV encoder”: all you have to do is write a simple header of some 44 bytes. Then you can append the LPCM data you already have.
zoul
A: 

PCM is just a digital representation of an analog audio signal. Unfortunately, it carries none of the metadata about the audio (channels, bit depth, sample rate), all of which is necessary to read the PCM data correctly. You might expect AVAudioPlayer to accept this PCM data wrapped in an NSData object if you could set those parameters manually on the AVAudioPlayer object; unfortunately they are read-only. So even though the documentation says AVAudioPlayer can handle anything Core Audio can handle, it has no way to handle raw LPCM data.

As zoul stated, I would imagine the easiest way to go about this is to prepend a WAV header, since the header tells AVAudioPlayer the necessary parameters listed above. It's 44 bytes, easily mocked up, and nicely defined; I used that same definition to implement WAV header encoding and decoding. And it's simply prepended to your unmodified LPCM data.

rcw3