I'm prototyping a client that displays streaming video from a HaiVision Barracuda encoder through a QuickTime (QTKit) client. I haven't been able to reduce the buffer below 3.0 seconds. For this application we need the lowest latency the network allows, and we'd rather take video dropouts than delay. I'm doing the following:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"haivision" ofType:@"sdp"];
    NSError *error = nil;
    QTMovie *qtmovie = [QTMovie movieWithFile:path error:&error];
    if (error != nil) {
        NSLog(@"error: %@", [error localizedDescription]);
        return;
    }

    // Drop down to the C QuickTime API to reach the streaming media handler.
    Movie movie = [qtmovie quickTimeMovie];
    long trackCount = GetMovieTrackCount(movie);   // sanity check: should be >= 1
    Track theTrack = GetMovieTrack(movie, 1);      // the streaming track is track 1 here
    Media theMedia = GetTrackMedia(theTrack);
    MediaHandler theMediaHandler = GetMediaHandler(theMedia);

    // Ask the streaming media handler for its presentation.
    QTSMediaPresentationParams myPres;
    ComponentResult cr = QTSMediaGetIndStreamInfo(theMediaHandler, 1,
                                                  kQTSMediaPresentationInfo, &myPres);

    // Request a shorter target buffer. Fixed is 16.16 fixed point,
    // so 1 << 15 encodes 0.5 seconds.
    Fixed shortDelay = 1 << 15;
    OSErr theErr = QTSPresSetInfo(myPres.presentationID,
                                  kQTSAllStreams,
                                  kQTSTargetBufferDurationInfo,
                                  &shortDelay);
    NSLog(@"ComponentResult %ld, OSErr %d", (long)cr, theErr);

    [movieView setMovie:qtmovie];
    [movieView play:self];
}
I seem to be getting valid objects/structures all the way down to the QTSPresentation, but both the ComponentResult and the OSErr come back as -50 (paramErr). The streaming video plays fine, but the buffer is still 3.0 seconds. Any help/insight appreciated.
J