I've read "uiimagepickercontroller uiimage memory and more" and other relevant questions, but I can't get my head around one thing, and I wonder if there are people around here with experience of this particular aspect.
In my app I let the user select an image from their library, ultimately resulting in an upload. (An interesting thing here is that the images in there may originate from high-quality 12-megapixel cameras, since e.g. iTunes happily syncs them onto the phone.)
For various reasons, I run the image through UIImageJPEGRepresentation right away and write the result to a locally stored file. Without much thought, I also hung on to the UIImage returned by the picker.
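Concretely, the save step looks something like this sketch (the file path, compression quality, and property name are my own choices, not anything prescribed):

```objc
// UIImagePickerController delegate callback (sketch; path and 0.8 quality are assumptions)
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    // Re-encode to JPEG and write it out right away for the later upload.
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.8);
    NSString *path = [NSTemporaryDirectory()
                      stringByAppendingPathComponent:@"upload.jpg"];
    [jpegData writeToFile:path atomically:YES];

    // ...and, without much thought, hang on to the UIImage itself:
    self.pickedImage = image;
}
```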
In Instruments, on the simulator, I see that the memory held by the UIImage returned by the UIImagePickerController gets released. Does this indeed mean it's backed by the file in the library, which we cannot access directly?
Does this also mean that the advice in the answer linked above, to store the image right away when you need the full resolution, only adds overhead (at least for pictures from the library)?
Furthermore, an image taken with the camera may or may not have some internal file backing; I haven't investigated that yet (why can't the simulator just use my iSight?). If it doesn't, it would surely be a memory hog, and the first thing to do would be a UIImageJPEGRepresentation followed by a UIImage imageWithContentsOfFile: to enable the backing, which would come at the cost of quite a delay.
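That round trip would be something like the following hypothetical sketch (variable names, path, and quality are all mine):

```objc
// Hypothetical: trade CPU time and a delay for memory by writing the
// camera image to disk and reloading it, hoping the reloaded UIImage
// is file-backed rather than holding the full decoded bitmap.
NSData *jpegData = UIImageJPEGRepresentation(cameraImage, 0.8); // quality is a guess
NSString *path = [NSTemporaryDirectory()
                  stringByAppendingPathComponent:@"camera.jpg"];
[jpegData writeToFile:path atomically:YES];

// Drop the in-memory original and reload from the file.
cameraImage = nil;
UIImage *backedImage = [UIImage imageWithContentsOfFile:path];
```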
Does anyone have any thoughts on this?