Hi all,

I am using AVFoundation to capture camera frames, and I would like to process and display them in a UIImageView, but I am having some trouble. I have the following code:

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{ 
    //NSLog(@"Capturing\n");
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];     

    NSLog(@"Image: %f %f\n", image.size.height, image.size.width);
    [imageView setImage:image];
}

However, it won't display. The correct size shows up in the NSLog, and when I put:

[imageView setImage:[UIImage imageNamed:@"SomethingElse.png"]]; 

in viewDidLoad, an image is displayed correctly (so I know the UIImageView is connected correctly).

Is there any reason this shouldn't work? I am at a loss right now.

Cheers, Brett
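
For context, the `imageFromSampleBuffer:` helper referenced in the question isn't shown; a typical implementation, along the lines of Apple's sample code, looks roughly like the sketch below (it assumes the `AVCaptureVideoDataOutput` is configured to deliver `kCVPixelFormatType_32BGRA` pixel buffers):

    // Sketch of a BGRA sample-buffer-to-UIImage conversion. Assumes the
    // capture output's videoSettings request kCVPixelFormatType_32BGRA.
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Draw the raw pixel data into a Core Graphics bitmap context.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height,
            8, bytesPerRow, colorSpace,
            kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);

        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        UIImage *image = [UIImage imageWithCGImage:quartzImage];
        CGImageRelease(quartzImage);
        return image;
    }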

A: 

Is imageView set correctly? If imageView is actually nil, your call to [imageView setImage:image] will silently do nothing, since messages to nil are no-ops in Objective-C.

Nicholas M T Elliott
Yup, when I NSLog(@"%@", imageView); I can see the address, the CALayer address, etc. So it looks like it is not nil.
Brett
More specifically: the NSLog of imageView looks like: <UIImageView: 0x130c50; frame = (0 0; 320 480); opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x12f730>>
Brett
+2  A: 

Are you doing this on the main thread? UIKit views must only be touched from the main thread, and a sample buffer delegate on a custom dispatch queue is called on that queue. You can either use the main queue for the delegate (undesirable, because you're doing processing first) or dispatch just the view update back to the main queue:

dispatch_async(dispatch_get_main_queue(), ^{
    imageView.image = ...;
});
jtbandes
Thank you, your response saved me what would probably have been a long time of debugging! I was capturing output on my own queue.
denrk
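
Putting the accepted answer together with the delegate method from the question, the fix can be sketched as follows (method and variable names match the question; the capture-session and queue setup are assumed to exist elsewhere):

    // Runs on the custom capture queue set via
    // -setSampleBufferDelegate:queue:, so hop to the main queue for UIKit.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Heavy per-frame processing stays on the background queue.
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

        // UIKit is not thread-safe; update the view on the main queue.
        dispatch_async(dispatch_get_main_queue(), ^{
            imageView.image = image;
        });
    }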