I'm working with the iPhone SpeakHere example, and I would like to be able to play audio from either the mic (as in the example) or from a wav file. I have working code to play from a particular wav file, which looks like this:

    // Requires the AVFoundation framework (#import <AVFoundation/AVFoundation.h>)
    NSString *path = [[NSBundle mainBundle] pathForResource:@"basketBall" ofType:@"wav"];
    AVAudioPlayer *theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
    theAudio.delegate = self;
    [theAudio play];

So I'm fine with actually getting the wav to play in the application (I can hook it up to a button, etc.) but I would like it to also behave the same way pushing the "Play" button does after recorded speech, in that it should be connected to the same visualization (which I have modified quite a bit, but essentially shows the current volume, among other things).

Thanks for your help!