audioqueueservices

Extracting mp3 file duration using AudioQueueServices

I have implemented a streaming mp3 player using AudioQueueServices that downloads mp3s over an NSURLConnection. Playback, pausing, and seeking work great; however, I can't figure out how to extract the duration of the audio from the mp3 files while they are still being downloaded. I'd like to be able to pull the info from the id3 tags...
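
One way to get a usable duration before the whole file has arrived (a sketch, assuming the downloaded bytes are being run through Audio File Stream Services, as most AudioQueue streaming players do) is to ask the parser for the bit rate and the audio data byte count once it has seen enough of the stream. If the ID3 tag doesn't carry a length frame, this estimate is a common fallback:

    #include <AudioToolbox/AudioToolbox.h>

    // Estimate total duration from the stream parser's bit rate and data size.
    static Float64 EstimatedDurationSeconds(AudioFileStreamID stream)
    {
        UInt32 bitRate = 0;            // bits per second
        UInt64 dataByteCount = 0;      // audio data size, excluding tags/headers
        UInt32 size = sizeof(bitRate);
        OSStatus err = AudioFileStreamGetProperty(stream,
                           kAudioFileStreamProperty_BitRate, &size, &bitRate);
        if (err != noErr || bitRate == 0) return 0.0;

        size = sizeof(dataByteCount);
        err = AudioFileStreamGetProperty(stream,
                  kAudioFileStreamProperty_AudioDataByteCount, &size, &dataByteCount);
        if (err != noErr || dataByteCount == 0) return 0.0;

        return (Float64)dataByteCount * 8.0 / (Float64)bitRate;
    }

If the HTTP Content-Length is known, the content length minus kAudioFileStreamProperty_DataOffset can stand in for the byte count; for VBR files the result is only approximate.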

Best way to learn iPhone Audio Queue Services, step-by-step tutorial

I'm trying to learn how to handle audio at a fairly low level with Audio Queue Services. I have been programming in memory-managed languages for quite a while, and have just completed the C programming tutorial by VTC (2007). This has left me comfortable with my understanding of pointers and memory allocation, but the Apple documentation ...

Is there an FSRef in the iPhone SDK, or is there something that can serve as an alternative to FSRef?

The question may sound stupid, but the thing is this: I am learning how to use an Audio Queue, and the example I've taken (aqtest) has been a nice guide for me until I recently found out that aqtest is not for the iPhone. (Stupid me.) I searched around the Internet and found out that there is no FSRef for the iPhone. If possible, I want to find a w...
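
On the iPhone, the FSRef-based AudioFileOpen path in aqtest can usually be replaced with the CFURL-based AudioFileOpenURL. A minimal sketch under that assumption (the path argument is a placeholder):

    #include <AudioToolbox/AudioToolbox.h>
    #include <string.h>

    // Open an audio file by filesystem path instead of by FSRef.
    static AudioFileID OpenAudioFileAtPath(const char *path)
    {
        CFURLRef url = CFURLCreateFromFileSystemRepresentation(
                           kCFAllocatorDefault, (const UInt8 *)path,
                           strlen(path), false /* not a directory */);
        AudioFileID file = NULL;
        OSStatus err = AudioFileOpenURL(url, kAudioFileReadPermission, 0, &file);
        CFRelease(url);
        return (err == noErr) ? file : NULL;
    }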

How do I deal with forward declarations / #import correctly in Cocoa Touch (Objective-C mixed with C++)?

I am trying to write this header file: //@class AQPlayer; //#import "AQPlayer.h" @interface AQ_PWN_iPhoneViewController : UIViewController { AQPlayer* player; } @end AQPlayer is a class implemented in a .mm file, written in C++. I tried a class forward declaration here, but the compiler complains: error: cannot find interface declaration for '...
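
Since @class only forward-declares Objective-C classes, a C++ class such as AQPlayer needs a C++-style forward declaration instead. One common pattern (a sketch using the names from the question) guards the declaration so that plain Objective-C files can still import this header:

    // AQ_PWN_iPhoneViewController.h
    #import <UIKit/UIKit.h>

    #ifdef __cplusplus
    class AQPlayer;                    // real C++ forward declaration for .mm files
    #else
    typedef struct AQPlayer AQPlayer;  // opaque type so plain .m files still compile
    #endif

    @interface AQ_PWN_iPhoneViewController : UIViewController {
        AQPlayer *player;              // pointer only, so no full definition is needed here
    }
    @end

Any file that actually calls into AQPlayer then needs to be renamed to .mm (so it compiles as Objective-C++) and should #import "AQPlayer.h" in its implementation file rather than in this header.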

How do I get the filesystem path for a resource on iPhone?

On the iPhone I need to get the path for a resource. OK, done that, but when it comes to the CFURLCreateFromFileSystemRepresentation call, I just don't know how to solve this. Why does this error occur? Any solution or workaround would be highly appreciated. Thank you in advance. I have taken a look at the following examples in order to ...
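
One way the aqtest-style code is usually adapted is to look the resource up through NSBundle and hand its filesystem representation to CFURLCreateFromFileSystemRepresentation. A sketch, with a placeholder resource name and type:

    #import <Foundation/Foundation.h>
    #include <string.h>

    CFURLRef CopyURLForBundledResource(void)
    {
        NSString *path = [[NSBundle mainBundle] pathForResource:@"sound" ofType:@"caf"];
        if (path == nil) return NULL;   // resource missing from the bundle

        const char *fsRep = [path fileSystemRepresentation];
        return CFURLCreateFromFileSystemRepresentation(kCFAllocatorDefault,
                   (const UInt8 *)fsRep, strlen(fsRep), false);
    }

Alternatively, CFBundleCopyResourceURL on CFBundleGetMainBundle() returns a CFURLRef directly and skips the conversion step.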

Missing chunks when creating file with AudioQueue

So a .wav file has a few standard chunks. In most of the files I work with, the "RIFF" chunk comes first, then a "fmt " chunk, then the "DATA" chunk. When recording using AVAudioRecorder, those chunks are created (although an extra "FLLR" chunk is created before the "DATA" chunk). When creating a file with AudioQueue, those standard chunks aren'...
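
If the file handed to the AudioQueue is created through Audio File Services with a WAVE type hint, the framework writes the standard chunks itself; a frequent gotcha is that the chunk sizes are only finalized when AudioFileClose is called after AudioQueueStop. A minimal sketch, assuming a linear PCM recording format:

    #include <AudioToolbox/AudioToolbox.h>

    // Create the recording file as WAVE so Audio File Services writes the
    // RIFF / "fmt " / "data" chunks; sizes are fixed up on AudioFileClose.
    OSStatus CreateWaveFile(CFURLRef url,
                            const AudioStreamBasicDescription *format,
                            AudioFileID *outFile)
    {
        return AudioFileCreateWithURL(url,
                                      kAudioFileWAVEType,     // file type decides the chunks
                                      format,                 // must be a PCM format for WAVE
                                      kAudioFileFlags_EraseFile,
                                      outFile);
    }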

Reading audio buffer data with AudioQueue

I am attempting to read audio data via AudioQueue. When I do so, I can verify that the bit depth of the file is 16-bit. But when I get the actual sample data, I'm only seeing values from -128 to 128. I'm also seeing suspicious-looking interleaved data, which makes me pretty sure that I'm just not reading the data correctly. So to be...
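
Values that never leave -128..128 are the classic sign that the buffer is being walked one byte at a time; for 16-bit linear PCM, mAudioData should be interpreted as SInt16. A sketch of an input-callback body under that assumption (names are placeholders):

    #include <AudioToolbox/AudioToolbox.h>

    static void HandleInputBuffer(void *userData, AudioQueueRef queue,
                                  AudioQueueBufferRef buffer,
                                  const AudioTimeStamp *startTime,
                                  UInt32 numPackets,
                                  const AudioStreamPacketDescription *packetDesc)
    {
        // Interpreting mAudioData as SInt16 yields the full -32768..32767 range.
        const SInt16 *samples = (const SInt16 *)buffer->mAudioData;
        UInt32 sampleCount = buffer->mAudioDataByteSize / sizeof(SInt16);

        for (UInt32 i = 0; i < sampleCount; i++) {
            SInt16 s = samples[i];   // interleaved L/R if the stream has 2 channels
            (void)s;                 // ... process the sample ...
        }

        AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);  // hand the buffer back
    }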

Using AudioQueue to stream audio: how to get the length of the audio file before playback ends?

I've already finished implementing the player, and now I want to implement the progress bar. But I wonder if that's possible to do, since we are streaming the music, unless we are provided the length of the song beforehand. Please, I need your advice on this. ...
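
For the "position so far" half of the progress bar, the queue itself can be asked (a sketch, assuming the stream's sample rate is known from its AudioStreamBasicDescription); the total length, if the server doesn't send it, has to be estimated from the bit rate and byte count as in the earlier duration question:

    #include <AudioToolbox/AudioToolbox.h>

    // Current playback position in seconds for a running output queue.
    static Float64 CurrentPlaybackSeconds(AudioQueueRef queue, Float64 sampleRate)
    {
        AudioTimeStamp time;
        OSStatus err = AudioQueueGetCurrentTime(queue, NULL, &time, NULL);
        if (err != noErr || !(time.mFlags & kAudioTimeStampSampleTimeValid)) return 0.0;
        return time.mSampleTime / sampleRate;   // samples played so far / samples per second
    }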

iPhone SDK - How do you make a UIView that responds to audio input?

I am developing an app that uses Core Graphics and Audio Queue Services to generate an image that changes with the input from the iPhone's mic.  Some sort of interaction between Core Graphics and Audio Queue Services is killing my frame rate, keeping it at about 10 fps no matter what I do.  Profiling with Instruments tells me that my pro...
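
One pattern that usually keeps the frame rate up (a sketch, not necessarily what the poster's code does) is to never draw from the audio callback: the callback just stores the latest level, and a main-thread timer asks the view to redraw at a fixed rate. The class and ivar names below are placeholders:

    #import <UIKit/UIKit.h>

    @interface LevelView : UIView {
        float latestLevel;              // written by the audio callback, read in drawRect:
    }
    - (void)setLatestLevel:(float)level;
    - (void)startRedrawTimer;
    @end

    @implementation LevelView

    - (void)setLatestLevel:(float)level { latestLevel = level; }

    - (void)startRedrawTimer {
        [NSTimer scheduledTimerWithTimeInterval:1.0 / 30.0   // ~30 fps redraw
                                         target:self
                                       selector:@selector(redrawTimerFired:)
                                       userInfo:nil
                                        repeats:YES];
    }

    - (void)redrawTimerFired:(NSTimer *)timer { [self setNeedsDisplay]; }

    - (void)drawRect:(CGRect)rect {
        // ... draw with Core Graphics using latestLevel ...
    }
    @end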

AudioQueueOfflineRender returning empty data

I'm having problems using AudioQueueOfflineRender to decode AAC data. When I examine the buffer after the call, it is always filled with empty data. I made sure the input buffer is valid and packet descriptions are provided. I searched and found that a few others have had the same problem: http://lists.apple.com/archives/Coreaudio-api/...

Building better positional audio [AudioQueue manipulation]

Hey folks, I'm building an app that has a requirement for really accurate positional audio, down to the level of modelling inter-aural time difference (ITD), the slight delay difference between stereo channels that varies with a sound's position relative to a listener. Unfortunately, the iPhone's implementation of OpenAL doesn't have th...
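
The time-difference half of ITD can be sketched as a small piece of buffer manipulation: delay one channel of the interleaved stereo data by a handful of samples (the physically plausible range at 44.1 kHz is roughly 0-30 samples, ~0-0.7 ms). The function name is a placeholder, and carrying the delay line across buffer boundaries is left out:

    #include <stdint.h>
    #include <stddef.h>

    // Delay the right channel of an interleaved SInt16 stereo buffer by
    // delaySamples frames to simulate an inter-aural time difference.
    static void ApplyITDRightDelay(int16_t *interleaved, size_t frameCount,
                                   size_t delaySamples)
    {
        if (delaySamples == 0 || delaySamples >= frameCount) return;

        // Walk backwards so earlier right-channel samples aren't overwritten
        // before they are copied.
        for (size_t frame = frameCount; frame-- > delaySamples; ) {
            interleaved[2 * frame + 1] = interleaved[2 * (frame - delaySamples) + 1];
        }
        // The first delaySamples frames have no earlier right-channel data; zero them.
        for (size_t frame = 0; frame < delaySamples; frame++) {
            interleaved[2 * frame + 1] = 0;
        }
    }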

How to lower the sound volume with the iPhone SDK's AudioQueue?

Hi, I'm using Aran Mulhollan's RemoteIOPlayer, using audio queues in the iPhone SDK. Without problems I can add two signals to mix sounds and increase the volume by multiplying the UInt32 values I get from the wav files, BUT every other operation gives me warped and distorted sound, and in particular I can't divide the signal. I can't...
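
The distortion when dividing most likely comes from doing the arithmetic on the samples as unsigned 32-bit words: each UInt32 read from a 16-bit stereo wav actually packs two signed 16-bit samples, which is why addition appears to work but division and fractional scaling corrupt the sign bits. A sketch of scaling each SInt16 sample through a float instead, with clamping (which also matters when two signals are mixed):

    #include <stdint.h>
    #include <stddef.h>

    // Scale interleaved 16-bit PCM by a fractional gain, sample by sample.
    static void ApplyGain(int16_t *samples, size_t count, float gain)
    {
        for (size_t i = 0; i < count; i++) {
            float scaled = samples[i] * gain;
            if (scaled >  32767.0f) scaled =  32767.0f;   // clamp to the 16-bit range,
            if (scaled < -32768.0f) scaled = -32768.0f;   // e.g. after mixing two signals
            samples[i] = (int16_t)scaled;
        }
    }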

How to play next file using Audio Queue Services

What is the right way to play the next file using Audio Queue Services? When the "play next" button is pressed, should I first call AudioQueueStop and then AudioQueuePrime/AudioQueueStart, or is it enough to just fill the buffers with the next file's data? The problem is that the latter gives me sound glitches on the iPhone. ...
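
One approach that avoids the glitches (a sketch; FillBufferFromCurrentFile stands in for the app's own read-and-enqueue routine, and it assumes the next file has the same format as the queue): stop the queue synchronously so the old buffers are discarded, refill the buffers from the new file, then prime and start again. If the format changes, the queue has to be disposed of and recreated instead.

    #include <AudioToolbox/AudioToolbox.h>

    // Placeholder for the app's own routine that reads from the current file
    // and enqueues the buffer on the queue.
    extern void FillBufferFromCurrentFile(AudioQueueRef queue, AudioQueueBufferRef buffer);

    void PlayNextFile(AudioQueueRef queue, AudioQueueBufferRef buffers[], int bufferCount)
    {
        AudioQueueStop(queue, true);                 // synchronous stop, drops pending audio

        for (int i = 0; i < bufferCount; i++)
            FillBufferFromCurrentFile(queue, buffers[i]);   // enqueue data from the new file

        AudioQueuePrime(queue, 0, NULL);             // pre-decode before starting (optional)
        AudioQueueStart(queue, NULL);
    }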

AudioQueue +kAudioQueueProperty_CurrentLevelMeterDB

Hi, I am using AudioQueue to record a sound file, in which AudioQueueLevelMeterState has mPeakPower and mAveragePower. In which unit (dB or dBFS) does it return a value if I use (AudioQueuePropertyID) kAudioQueueProperty_CurrentLevelMeterDB? ...
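
The ...DB variant reports power in dBFS (0 dB is full scale; silence is a large negative number), while the plain kAudioQueueProperty_CurrentLevelMeter reports a linear value in [0, 1]. A sketch of reading it on a mono recording queue, remembering that metering has to be switched on first:

    #include <AudioToolbox/AudioToolbox.h>

    void ReadLevel(AudioQueueRef queue)
    {
        UInt32 on = 1;
        AudioQueueSetProperty(queue, kAudioQueueProperty_EnableLevelMetering,
                              &on, sizeof(on));

        AudioQueueLevelMeterState level;        // one element per channel (mono here)
        UInt32 size = sizeof(level);
        if (AudioQueueGetProperty(queue, kAudioQueueProperty_CurrentLevelMeterDB,
                                  &level, &size) == noErr) {
            // level.mAveragePower and level.mPeakPower are in dBFS here
        }
    }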

Audio Queue Services on iPhone only playing the first enqueued buffer?

Hey all, I've been up all night trying to figure this one out. My code is basically the same as Apple's example here. However, the device plays only the FIRST buffer placed in the queue (I hear the contents of the first buffer when the app starts). After that, no sound will come from the device at all. The playback is still running...
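
With code derived from Apple's example, playback that dies after the first buffer almost always means the output callback isn't refilling and re-enqueueing buffers (or is enqueueing them with a zero byte size). A sketch of the callback shape; ReadNextPackets is a placeholder for the app's own AudioFileReadPackets-based read:

    #include <AudioToolbox/AudioToolbox.h>

    // Placeholder: fills the buffer from the file, returns bytes read.
    extern UInt32 ReadNextPackets(void *state, AudioQueueBufferRef buffer,
                                  AudioStreamPacketDescription *descs,
                                  UInt32 *numPackets);

    static void OutputCallback(void *userData, AudioQueueRef queue,
                               AudioQueueBufferRef buffer)
    {
        AudioStreamPacketDescription descs[512];
        UInt32 numPackets = 512;
        UInt32 bytesRead = ReadNextPackets(userData, buffer, descs, &numPackets);

        if (bytesRead > 0) {
            buffer->mAudioDataByteSize = bytesRead;
            AudioQueueEnqueueBuffer(queue, buffer, numPackets, descs);
        } else {
            AudioQueueStop(queue, false);   // end of file: let queued audio finish
        }
    }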

AudioQueueGetProperty returns out-of-range value for kAudioQueueProperty_CurrentLevelMeter property

I'm writing an application that requires the user be in a quiet environment. To do this, I periodically check the power reading off the microphone. (I'm aware that the returned value is in dBFS or, in this case, a float in the interval [0, 1].) My problem is that the code below works just fine... except when it returns 184660647322837...
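
A number that large usually means the property call failed or the output wasn't sized as expected, so uninitialized memory was read. A sketch that sizes the array as one AudioQueueLevelMeterState per channel and checks the OSStatus before trusting the result (metering must already be enabled, as in the previous question):

    #include <AudioToolbox/AudioToolbox.h>
    #include <stdlib.h>

    // Read the linear (0..1) meter for a queue with channelCount channels.
    float AveragePowerOfChannel0(AudioQueueRef queue, UInt32 channelCount)
    {
        AudioQueueLevelMeterState *levels =
            calloc(channelCount, sizeof(AudioQueueLevelMeterState));
        UInt32 size = channelCount * sizeof(AudioQueueLevelMeterState);

        OSStatus err = AudioQueueGetProperty(queue,
                           kAudioQueueProperty_CurrentLevelMeter, levels, &size);
        float result = (err == noErr) ? levels[0].mAveragePower : 0.0f;
        free(levels);
        return result;
    }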

Manipulating AudioQueueBuffer audio data results in noise.

I'm trying to implement an iPhone sound-processing app using the SpeakHere sample app as a starting point. The app involves manipulating buffered audio samples during playback. When I multiply the samples by a fractional number (0.9, for instance) I get noise as a result. The strangest thing about it is that when multiplying samples by whole numbe...
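
Whole numbers working while 0.9 produces noise is the classic symptom of scaling the wrong sample type (individual bytes or packed words rather than SInt16 values). A sketch of scaling a SpeakHere-style playback buffer, assuming 16-bit linear PCM:

    #include <stdint.h>

    // Scale a buffer of 16-bit signed samples; mAudioData / mAudioDataByteSize
    // come from the AudioQueueBuffer being played.
    static void ScaleBuffer(void *mAudioData, uint32_t mAudioDataByteSize, float factor)
    {
        int16_t *samples = (int16_t *)mAudioData;             // correct view of the data
        uint32_t count = mAudioDataByteSize / sizeof(int16_t);
        for (uint32_t i = 0; i < count; i++) {
            samples[i] = (int16_t)(samples[i] * factor);       // e.g. factor = 0.9f
        }
    }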

Audio Queue: Timestamps for recorded buffers

I am trying to grab times out of recorded buffers in my AudioInputCallback function for a recording queue. Unfortunately the timestamps I'm seeing aren't as expected. Here's an example (using AudioTimeStamp.mHostTime): 2010-01-21 14:03:35.252 [61694:207] 1288747268011206 1288747396166138 -128154932 2010-01-21 14:03:35.344 [61694:2...
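
mHostTime is measured in mach host-time ticks, not nanoseconds or seconds, which is why the raw numbers (and differences between them) look odd. A sketch of converting it; alternatively, mSampleTime divided by the sample rate is often easier for the spacing between recorded buffers:

    #include <mach/mach_time.h>
    #include <AudioToolbox/AudioToolbox.h>

    // Convert an AudioTimeStamp's mHostTime (mach host ticks) into seconds.
    static double HostTimeToSeconds(UInt64 hostTime)
    {
        static mach_timebase_info_data_t timebase;
        if (timebase.denom == 0) {
            mach_timebase_info(&timebase);       // ticks -> nanoseconds ratio
        }
        double nanos = (double)hostTime * timebase.numer / timebase.denom;
        return nanos / 1e9;
    }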

After interruption, delayed audio route change notifications when recording

My iPhone application requires that I know when a user has/has not plugged in her headphones. That's easy. AudioSessionAddPropertyListener with a callback listening to kAudioSessionProperty_AudioRouteChange. I write logs with NSLog as things happen. User plugs the headphones in? Get a notification, and a line in the gdb console. User un...
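
One pattern worth trying (a sketch of the iPhone OS 3.x-era C API, and not a guaranteed fix for the delay): reactivate the audio session from the interruption listener as soon as the interruption ends, so the route-change listener is serviced again promptly:

    #include <AudioToolbox/AudioToolbox.h>

    static void InterruptionListener(void *clientData, UInt32 interruptionState)
    {
        if (interruptionState == kAudioSessionEndInterruption) {
            AudioSessionSetActive(true);    // re-activate; property listeners resume firing
        } else if (interruptionState == kAudioSessionBeginInterruption) {
            // stop or pause the recording queue here
        }
    }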

Objective-C - How to serialize an audio file into small packets that can be played?

Hi there. So, I would like to take a sound file, convert it into packets, and send it to another computer. I would like the other computer to be able to play the packets as they arrive. I am using AVAudioPlayer to try to play these packets, but I couldn't find a proper way to serialize the data on peer1 so that peer2 can play it. T...
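
AVAudioPlayer expects a complete file or container, which is why playing arbitrary chunks fails. The usual receiving-side approach is Audio File Stream Services feeding an Audio Queue: hand the parser each chunk of bytes as it arrives, and its packets callback delivers playable packets. A sketch of that shape, with the callback bodies and the enqueueing elided:

    #include <AudioToolbox/AudioToolbox.h>

    static void PropertyListener(void *clientData, AudioFileStreamID stream,
                                 AudioFileStreamPropertyID propertyID,
                                 AudioFileStreamPropertyFlags *flags) { }

    static void PacketsReceived(void *clientData, UInt32 numBytes, UInt32 numPackets,
                                const void *inputData,
                                AudioStreamPacketDescription *packetDescs)
    {
        // copy inputData into an AudioQueueBufferRef and AudioQueueEnqueueBuffer it
    }

    static AudioFileStreamID OpenParser(void *playerState)
    {
        AudioFileStreamID stream = NULL;
        AudioFileStreamOpen(playerState, PropertyListener, PacketsReceived,
                            0 /* no file-type hint */, &stream);
        return stream;
    }

    // Called for every chunk of bytes received from peer1 over the network:
    static void BytesArrived(AudioFileStreamID stream, const void *bytes, UInt32 length)
    {
        AudioFileStreamParseBytes(stream, length, bytes, 0);
    }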