Hi all,
I am using AVFoundation to capture camera frames, and I'd like to process and display them in a UIImageView, but I'm having some trouble.
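For context, my capture session is set up roughly like this (a sketch from memory; the Medium preset, the "myQueue" label, and the BGRA pixel format are just the choices I happened to make, and my view controller adopts AVCaptureVideoDataOutputSampleBufferDelegate):

- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session and pick a preset
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Hook up the default camera as the input
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        NSLog(@"Couldn't create input: %@", error);
        return;
    }
    [session addInput:input];

    // Video data output that delivers BGRA frames to my delegate on a serial queue
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    [session addOutput:output];

    [session startRunning];
}

The frames then come in through this delegate method: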
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    //NSLog(@"Capturing\n");
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    NSLog(@"Image: %f %f\n", image.size.height, image.size.width);
    [imageView setImage:image];
}
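For reference, my imageFromSampleBuffer: is more or less Apple's sample code for turning a CMSampleBufferRef into a UIImage (it assumes the BGRA output format set above):

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get the pixel buffer and lock its base address so we can read the pixels
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and make a CGImage from it
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}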
However, no image ever shows up in the view. The correct size shows up in the NSLog output, and when I put:
[imageView setImage:[UIImage imageNamed:@"SomethingElse.png"]];
in viewDidLoad, that image is displayed fine (so I know the UIImageView is connected properly).
Is there any reason this shouldn't work? I'm at a loss right now.
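The only theory I have is threading: as far as I know, the delegate is called on the dispatch queue I passed to setSampleBufferDelegate:queue:, not on the main thread, and UIKit is supposed to be touched only from the main thread. I tried bouncing the update back to the main queue like this (just a sketch of my attempt), but I'd like to know whether that's actually the problem:

    // UIKit isn't thread-safe, so update the image view on the main queue
    dispatch_async(dispatch_get_main_queue(), ^{
        [imageView setImage:image];
    });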
Cheers, Brett