In an iPhone app, I need to send a JPEG by mail with a maximum size of 300 KB (I don't know the maximum attachment size Mail.app allows, but that's another problem). To do that, I'm trying to decrease the quality until I obtain an image under 300 KB.

In order to find the quality value (compressionLevel) that gives me a JPEG under 300 KB, I wrote the following loop. It works, but each time the loop executes, memory grows by the size of my original JPEG (700 KB), despite the "[tmpImage release];".

float compressionLevel = 1.0f;
int size = 300001;
while (size > 300000) {
    UIImage *tmpImage = [[UIImage alloc] initWithContentsOfFile:[self fullDocumentsPathForTheFile:@"imageToAnalyse.jpg"]];
    size = [UIImageJPEGRepresentation(tmpImage, compressionLevel) length];
    [tmpImage release];
    // The decrement below is commented out on purpose, just to test the memory increase
    //compressionLevel = compressionLevel - 0.001f;
    NSLog(@"Compression: %f", compressionLevel);
}

Any ideas about how I can get rid of it, or why it happens? Thanks.

+2  A: 

At the very least, there's no point in allocating and releasing the image on every trip through the loop. It shouldn't leak memory, but it's unnecessary, so move the alloc/init and release out of the loop.
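
Something like this (a sketch of the rearranged loop; fullDocumentsPathForTheFile: is your existing helper, and the 0.05f step is just an illustrative decrement so the loop terminates):

UIImage *tmpImage = [[UIImage alloc] initWithContentsOfFile:[self fullDocumentsPathForTheFile:@"imageToAnalyse.jpg"]];
float compressionLevel = 1.0f;
NSUInteger size = 300001;
while (size > 300000) {
    // Re-encode the same in-memory image at progressively lower quality
    size = [UIImageJPEGRepresentation(tmpImage, compressionLevel) length];
    compressionLevel -= 0.05f;
}
[tmpImage release];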

Also, the data returned by UIImageJPEGRepresentation is autoreleased, so it'll hang around until the current autorelease pool drains (when you get back to the main event loop). Consider adding:

NSAutoreleasePool *p = [[NSAutoreleasePool alloc] init];

at the top of the loop, and

[p drain];

at the end. That way you'll not be leaking all of the intermediate memory.
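
With the pool in place, the loop body would look something like this (a sketch, assuming the image is loaded once outside the loop as suggested above):

while (size > 300000) {
    NSAutoreleasePool *p = [[NSAutoreleasePool alloc] init];
    // The autoreleased NSData from UIImageJPEGRepresentation is now freed
    // each time the pool drains, instead of accumulating until the event loop.
    size = [UIImageJPEGRepresentation(tmpImage, compressionLevel) length];
    compressionLevel -= 0.05f;
    [p drain];
}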

And finally, doing a linear search for the optimal compression setting is probably pretty inefficient. Do a binary search instead.
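
For instance, here's a sketch of a binary search over the quality value (tmpImage is assumed loaded as above; the maxBytes name and the fixed 12-iteration cap are illustrative choices, not part of the original code):

NSUInteger maxBytes = 300000;
float lo = 0.0f;
float hi = 1.0f;
NSData *best = nil;
for (int i = 0; i < 12; i++) {
    NSAutoreleasePool *p = [[NSAutoreleasePool alloc] init];
    float mid = (lo + hi) / 2.0f;
    NSData *candidate = UIImageJPEGRepresentation(tmpImage, mid);
    if ([candidate length] > maxBytes) {
        hi = mid;                  // too big: lower the quality ceiling
    } else {
        lo = mid;                  // fits: remember it and try a higher quality
        [best release];
        best = [candidate retain]; // keep it alive past the pool drain
    }
    [p drain];
}
// 'best' (if non-nil) now holds the highest-quality JPEG found under
// maxBytes; release it when you're done with it.

Halving the interval 12 times narrows the quality step below 0.001, matching the granularity of the original linear scan in a dozen encodes instead of hundreds.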

Mark Bessey
FredM
http://en.wikipedia.org/wiki/Binary_search_algorithm Basically, start at one end of the possible range of compression values. At each step, make the new value equal to the average of the last tested value and the last value where the compressed size was under the limit.
Mark Bessey
For sure, the binary search is much more efficient. I made a little test: linear search: image size 249.764 KB, takes 180.64 s with 81 loops; binary search: image size 249.637 KB, takes 22.94 s with 11 loops. Thanks again.
FredM