avfoundation

iPhone AVComposition issue

Hi, I'm trying to create a video that shows two videos one after the other using AVComposition on the iPhone. This code runs, but I can only see one of the videos for the entire duration of the newly created video. - (void) startEdit{ AVMutableComposition* mixComposition = [AVMutableComposition composition]; NSString* a_inputFile...
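A minimal sketch of sequential playback with AVMutableComposition, assuming `firstAsset` and `secondAsset` are already-loaded assets (hypothetical names, not from the question). The usual cause of "only one video for the whole duration" is inserting both clips at kCMTimeZero; the second clip has to start where the first one ends.

```objc
#import <AVFoundation/AVFoundation.h>

AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error = nil;

// First clip at time zero.
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                    ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                     atTime:kCMTimeZero
                      error:&error];

// Second clip starts where the first one ends, not at kCMTimeZero.
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                    ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                     atTime:firstAsset.duration
                      error:&error];
```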

Cannot add category to AVAudioPlayer

I'm trying to add a category to AVAudioPlayer but I keep getting the error "Cannot find interface declaration for 'AVAudioPlayer'". AVAudioPlayer+Fade.h #import <AVFoundation/AVFoundation.h> @interface AVAudioPlayer (Fade) - (void)fadeToVolume:(CGFloat)volume; @end AVAudioPlayer+Fade.m @implementation AVAudioPlayer (Fade) - (voi...
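"Cannot find interface declaration for 'AVAudioPlayer'" in a category implementation most often means the .m file never sees the class declaration, i.e. it does not import its own header (or AVFoundation is not linked into the target). A sketch of the category laid out so that it compiles; the fade body is a placeholder, and note that AVAudioPlayer's volume property is a `float`, not a `CGFloat`:

```objc
// AVAudioPlayer+Fade.h
#import <AVFoundation/AVFoundation.h>

@interface AVAudioPlayer (Fade)
- (void)fadeToVolume:(float)volume;
@end

// AVAudioPlayer+Fade.m
#import "AVAudioPlayer+Fade.h"  // the .m must import its own header;
                                // without it the compiler has no interface
                                // declaration and reports exactly this error

@implementation AVAudioPlayer (Fade)
- (void)fadeToVolume:(float)volume {
    self.volume = volume;  // placeholder; a real fade would step this over time
}
@end
```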

What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?

I would like to stream video from an iPhone camera to an app running on a Mac. Think of it sort of like video chat, but only one way: from the device to a receiver app (and it's not video chat). My basic understanding so far: you can use AVFoundation to get 'live' video camera data without saving to a file, but it is uncompressed data and thus...

AVAssetExportSession - AVMutableCompositionTrack - exported output file is empty

Hello, I'm trying to add an .mp3 file to an AVMutableCompositionTrack, and after that I want to export the new file. The problem is: the generated file exists after exporting, but it is empty and cannot be played. Does anyone see the error in my code? AVMutableComposition *saveComposition = [AVMutableComposition composition]; NSArray *docpa...
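A sketch of an audio-only export, assuming `audioAsset` and `outputURL` are set up elsewhere (hypothetical names). An empty output file often means the preset/outputFileType pair cannot encode the source; AVAssetExportPresetAppleM4A with AVFileTypeAppleM4A is a safe combination for audio-only compositions, and checking the session's status in the completion handler reveals the real failure:

```objc
AVMutableComposition *saveComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *audioTrack =
    [saveComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error = nil;
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                     atTime:kCMTimeZero
                      error:&error];

AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:saveComposition
                                     presetName:AVAssetExportPresetAppleM4A];
exporter.outputURL = outputURL;                // path should end in .m4a
exporter.outputFileType = AVFileTypeAppleM4A;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"export failed: %@", exporter.error);  // the real diagnosis lives here
    }
}];
```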

AVMutableCompositionTrack - insertTimeRange - insertEmptyTimeRange issue

Hello there, I have a strange problem: I want to generate a new sound file out of two sound files and silence. sound1: 2 seconds long + silence: 2 seconds of silence + sound2: 2 seconds long. When I try the code below, I get a 6-second sound file with all the parts, but in a different order! The order is: sound1, sound2, silence. I am...
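One approach that avoids the reordering, sketched below with hypothetical `sound1Track`/`sound2Track` asset tracks: append the two clips back to back first, then open the empty gap between them. Per the documented semantics of insertEmptyTimeRange:, media that was presented during that interval is shifted to immediately after it, which yields sound1 / silence / sound2:

```objc
CMTime twoSeconds = CMTimeMakeWithSeconds(2.0, 600);
NSError *error = nil;

// 1. sound1 occupies 0-2 s.
[track insertTimeRange:CMTimeRangeMake(kCMTimeZero, twoSeconds)
               ofTrack:sound1Track atTime:kCMTimeZero error:&error];

// 2. sound2 is appended directly after sound1, occupying 2-4 s for now.
[track insertTimeRange:CMTimeRangeMake(kCMTimeZero, twoSeconds)
               ofTrack:sound2Track atTime:twoSeconds error:&error];

// 3. Open a 2 s empty range starting at 2 s; the media after that point
//    (sound2) shifts later, giving sound1 / silence / sound2.
[track insertEmptyTimeRange:CMTimeRangeMake(twoSeconds, twoSeconds)];
```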

How to use CVPixelBufferPool in conjunction with AVAssetWriterInputPixelBufferAdaptor in iphone?

Hi everybody, I have successfully created a video from images using the following code: -(void)writeImageAsMovie:(NSArray *)array toPath:(NSString*)path size:(CGSize)size duration:(int)duration { NSError *error = nil; AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL: [NSURL fileURLWithPath:pat...
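A sketch of the pool-based path, assuming `writerInput` is an AVAssetWriterInput already attached to the writer. The adaptor manages its own CVPixelBufferPool, so instead of allocating a fresh CVPixelBuffer per frame you draw into buffers vended by the pool. Note that `adaptor.pixelBufferPool` is nil until after -startWriting has been called:

```objc
NSDictionary *attrs = [NSDictionary dictionaryWithObject:
    [NSNumber numberWithInt:kCVPixelFormatType_32ARGB]
    forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];

AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                   sourcePixelBufferAttributes:attrs];

// ... after [videoWriter startWriting] and startSessionAtSourceTime: ...
CVPixelBufferRef pixelBuffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     adaptor.pixelBufferPool,
                                                     &pixelBuffer);
if (status == kCVReturnSuccess && pixelBuffer != NULL) {
    // render the current image into pixelBuffer here, then:
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
    CVPixelBufferRelease(pixelBuffer);
}
```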

progress of AVAssetWriter

How can I calculate the progress of an AVAssetWriter process? So if I have something like: [assetWriterInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{ while (1){ if ([assetWriterInput isReadyForMoreMediaData]) { CMSampleBufferRef sampleBuffer = [audioMixOutput copyNextSampleBuffer]; if (sampleBuffer) { ...
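AVAssetWriter exposes no progress property, so a common workaround (sketched here, with `totalDuration` assumed to be the source asset's duration) is to compare each sample buffer's presentation timestamp against the known total duration inside that same loop:

```objc
// Inside the requestMediaDataWhenReadyOnQueue: block, after copyNextSampleBuffer:
CMTime presTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
double progress = CMTimeGetSeconds(presTime) / CMTimeGetSeconds(totalDuration);
dispatch_async(dispatch_get_main_queue(), ^{
    // update UI here, e.g. progressView.progress = (float)progress;
});
```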

Front Camera Video Recording on iPhone 4?

I am trying to record a video from the front camera of an iPhone 4 using the AVFoundation framework, with the help of the WWDC samples I got from the iPhone Developer Program. But I still can't get it to work: the video does not get recorded, or maybe isn't saved to my iPhone library. Here's the code I am trying to use; it would be really helpful if someone could h...
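A sketch of explicitly selecting the front camera; a common failure mode is building the session from [AVCaptureDevice defaultDeviceWithMediaType:], which returns the back camera. Error handling is minimal here, and `fileURL` is an assumed writable movie URL:

```objc
AVCaptureDevice *frontCamera = nil;
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if (device.position == AVCaptureDevicePositionFront) {
        frontCamera = device;
        break;
    }
}

NSError *error = nil;
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];

if (input && [session canAddInput:input]) [session addInput:input];
if ([session canAddOutput:output]) [session addOutput:output];
[session startRunning];
// then, once running:
// [output startRecordingToOutputFileURL:fileURL recordingDelegate:self];
```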

Problem writing metadata to an image

Hi, I am using AVFoundation to take a still image, adding GPS info to the metadata, and saving to the Photos album using the Assets Library, but the GPS info is not saving at all. Here is my code... [self.stillImageTaker captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampl...
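A sketch of the inside of that completion handler, assuming `location` is a CLLocation obtained elsewhere. Two things commonly go wrong: the GPS values must live in a mutable copy of the sample buffer's attachments under kCGImagePropertyGPSDictionary, and the image must be saved with writeImageDataToSavedPhotosAlbum:metadata:completionBlock: (saving a UIImage drops the metadata). Requires AssetsLibrary and ImageIO in addition to AVFoundation:

```objc
NSData *imageData =
    [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];

CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(
    kCFAllocatorDefault, imageDataSampleBuffer, kCMAttachmentMode_ShouldPropagate);
NSMutableDictionary *metadata = [(NSDictionary *)attachments mutableCopy];
if (attachments) CFRelease(attachments);

NSMutableDictionary *gps = [NSMutableDictionary dictionary];
[gps setObject:[NSNumber numberWithDouble:fabs(location.coordinate.latitude)]
        forKey:(NSString *)kCGImagePropertyGPSLatitude];
[gps setObject:(location.coordinate.latitude >= 0 ? @"N" : @"S")
        forKey:(NSString *)kCGImagePropertyGPSLatitudeRef];
[gps setObject:[NSNumber numberWithDouble:fabs(location.coordinate.longitude)]
        forKey:(NSString *)kCGImagePropertyGPSLongitude];
[gps setObject:(location.coordinate.longitude >= 0 ? @"E" : @"W")
        forKey:(NSString *)kCGImagePropertyGPSLongitudeRef];
[metadata setObject:gps forKey:(NSString *)kCGImagePropertyGPSDictionary];

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageDataToSavedPhotosAlbum:imageData
                                 metadata:metadata
                          completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) NSLog(@"save failed: %@", error);
}];
```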

Reading samples via AVAssetReader

How do you read samples via AVAssetReader? I've found examples of duplicating or mixing using AVAssetReader, but those loops are always controlled by the AVAssetWriter loop. Is it possible just to create an AVAssetReader and read through it, getting each sample? Thanks. ...
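Yes, the reader can drive its own loop with no writer involved. A sketch for the first audio track of an assumed `asset`; passing nil output settings delivers samples in their stored format, while passing decompression settings (e.g. Linear PCM keys) yields decoded samples instead:

```objc
NSError *error = nil;
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track
                                               outputSettings:nil];
[reader addOutput:output];
[reader startReading];

CMSampleBufferRef sampleBuffer = NULL;
while ((sampleBuffer = [output copyNextSampleBuffer])) {
    // inspect or process each sample here
    CFRelease(sampleBuffer);
}

if (reader.status == AVAssetReaderStatusFailed) {
    NSLog(@"read failed: %@", reader.error);
}
```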