Hi Everyone,
I'm wondering if it's possible to have an AVCaptureSession and a UIImagePickerController both accessing the camera simultaneously? I want to create an app that shows an ambient light meter/indicator as an overlay view on a UIImagePickerController while the camera is active. I previously implemented this using UIGetScreenImage(), but Apple is now disallowing use of that private API in favor of AVCaptureSession. In my experimentation, the AVCaptureSession seems to become suspended as soon as the UIImagePickerController displays its camera view. Any ideas? Thanks!
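For context, here's a rough sketch of the kind of AVCaptureSession setup I've been experimenting with for the light meter (written in Swift for brevity; the class name and queue label are just placeholders). It reads the EXIF BrightnessValue from each video frame's metadata, which tracks ambient light reasonably well. This works fine on its own, but stops delivering frames once the picker's camera comes up:

```swift
import AVFoundation
import ImageIO

// Minimal sketch of an ambient-light meter driven by AVCaptureSession.
// Assumes NSCameraUsageDescription is set in Info.plist and the app has
// camera permission; "LightMeter" is just an illustrative name.
final class LightMeter: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let frameQueue = DispatchQueue(label: "light-meter.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: frameQueue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each frame carries EXIF metadata; BrightnessValue is a rough
        // exposure-based measure of ambient light.
        guard
            let attachments = CMCopyDictionaryOfAttachments(
                allocator: kCFAllocatorDefault,
                target: sampleBuffer,
                attachmentMode: kCMAttachmentMode_ShouldPropagate) as? [String: Any],
            let exif = attachments[kCGImagePropertyExifDictionary as String] as? [String: Any],
            let brightness = exif[kCGImagePropertyExifBrightnessValue as String] as? Double
        else { return }

        print("Ambient brightness: \(brightness)")
        // Dispatch to the main queue here to update the overlay view.
    }
}
```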