Exposure values from the camera can be acquired when you take a picture (without saving it to Saved Photos). A light-meter application on the iPhone does this, probably by using some private API.

That application works on the iPhone 3GS only, so I guess it may somehow be related to the EXIF data, which is populated with this information when the image is created.

This all applies to the 3GS.

Has anything changed with iPhone OS 4.0? Is there a regular way to get these values now?

Does anyone have a working code example for reading these camera/photo settings?

Thank you

+1  A: 

With AVFoundation in iOS 4.0 you can mess with exposure; refer specifically to AVCaptureDevice, here is a link: AVCaptureDevice ref. Not sure if it's exactly what you want, but you can look around AVFoundation and probably find some useful stuff.
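To illustrate the AVCaptureDevice route, here is a minimal Swift sketch of locking the camera's exposure. The original era was Objective-C, so the exact call names below are from the modern AVFoundation surface and are an assumption for illustration, not the iOS 4-era API verbatim; this also requires camera hardware to run.

```swift
import AVFoundation

// Sketch: locking exposure on the default camera via AVCaptureDevice.
// Configuration changes must happen between lockForConfiguration()
// and unlockForConfiguration().
func lockExposure() {
    guard let device = AVCaptureDevice.default(for: .video) else { return }
    do {
        try device.lockForConfiguration()
        if device.isExposureModeSupported(.locked) {
            device.exposureMode = .locked   // freeze the current auto-exposure
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not configure device: \(error)")
    }
}
```

Note this only controls the exposure mode; it does not report the exposure value itself, which is why the EXIF approach below kept coming up.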

Daniel
I need to get the exposure value the camera uses when it takes a picture. Depending on the lighting, the camera sets this value at the time the photo is taken.
Bojan Milankovic
A: 

** This should be a comment, but I can't post one since I don't have enough reputation **

I came here wanting to ask the same question.

iOS 4 came out with tons of new APIs, promising the world, more or less.

So there's the Image I/O framework, which gives you access to image metadata (specifically EXIF data, where you can find the exposure and lots more). Look at "Creating and Using Image Sources" in the Image I/O Programming Guide.

But you need to give it a URL or an NSData, so it can't work straight off the camera.
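For completeness, this is roughly what the Image I/O path looks like once you do have the bytes in hand. A short Swift sketch, assuming `jpegData` already came from somewhere (a file, the network, or the camera output described in the answer below):

```swift
import Foundation
import ImageIO

// Sketch: reading the EXIF dictionary from in-memory JPEG data with Image I/O.
func exifProperties(from jpegData: Data) -> [String: Any]? {
    guard let source = CGImageSourceCreateWithData(jpegData as CFData, nil) else {
        return nil
    }
    // All image properties for the first (index 0) image in the source.
    let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any]
    // The EXIF block (exposure time, ISO, aperture, etc.) lives under its own key.
    return props?[kCGImagePropertyExifDictionary as String] as? [String: Any]
}
```

So the missing piece is exactly what this answer says: getting NSData (with metadata intact) from the camera in the first place.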

There's also the new Assets Library framework for accessing the photo library and such, but if you save images to it programmatically they end up without the EXIF data, and you would need the user's approval every time you access the images anyhow.

I was sure that between these two I'd have the answer, but alas, no good so far. So much work has been done, and still we can't get the metadata for pictures we take. Such a pity...

Oded Ben Dov
A: 

Hi guys, I think I've finally found the lead to the real EXIF data. It'll be a while before I have actual code to post, but I figured this should be publicized in the meantime.

Google captureStillImageAsynchronouslyFromConnection. It's a method of AVCaptureStillImageOutput, and the following is an excerpt from the documentation (long sought after):

imageDataSampleBuffer - The data that was captured. The buffer attachments may contain metadata appropriate to the image data format. For example, a buffer containing JPEG data may carry a kCGImagePropertyExifDictionary as an attachment. See ImageIO/CGImageProperties.h for a list of keys and value types.

For an example of working with AVCaptureStillImageOutput, see the WWDC 2010 sample code, under AVCam.
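Since no code was posted yet, here is a hedged Swift sketch of the approach described above: once `captureStillImageAsynchronouslyFromConnection` hands you a CMSampleBuffer, the EXIF dictionary rides along as a buffer attachment. AVCaptureStillImageOutput is the iOS 4-era class (later deprecated in favour of AVCapturePhotoOutput), and this obviously needs a running capture session and real hardware:

```swift
import AVFoundation
import CoreMedia
import ImageIO

// Sketch: capture a still image and pull the EXIF attachment off the
// sample buffer, as described in the AVCaptureStillImageOutput docs.
// Assumes `output` is already attached to a running AVCaptureSession.
func captureWithExif(output: AVCaptureStillImageOutput) {
    guard let connection = output.connection(with: .video) else { return }
    output.captureStillImageAsynchronously(from: connection) { sampleBuffer, error in
        guard let buffer = sampleBuffer, error == nil else { return }
        // kCGImagePropertyExifDictionary is carried as a buffer attachment.
        if let exif = CMGetAttachment(buffer,
                                      key: kCGImagePropertyExifDictionary,
                                      attachmentModeOut: nil) as? [String: Any] {
            print("Exposure time:",
                  exif[kCGImagePropertyExifExposureTime as String] ?? "n/a")
            print("ISO:",
                  exif[kCGImagePropertyExifISOSpeedRatings as String] ?? "n/a")
        }
    }
}
```

See ImageIO/CGImageProperties.h (as the excerpt says) for the full list of EXIF keys available in that dictionary.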

Peace, O.

Oded Ben Dov