For an image-upload tool I want to automatically detect the (subjective) quality of an image, resulting in a quality rating.

I have the following idea to realize this heuristically:

  • Obviously incorporate the resolution into the rating.
  • Compress it to JPG (75% quality), decompress it, and compare the JPG size with the decompressed size to obtain a ratio. The blurrier the image is, the better it compresses, so the higher the ratio of decompressed size to JPG size (see the sketch below the list).
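
Roughly what I have in mind, as a Pillow-based sketch (the helper name is mine, the quality setting of 75 is from the idea above):

    import io
    from PIL import Image

    def blur_ratio(path):
        # Ratio of raw RGB byte count to JPG (quality 75) byte count.
        # Blurry, low-detail images compress well, so they score HIGHER.
        img = Image.open(path).convert("RGB")
        raw_size = img.width * img.height * 3  # bytes of uncompressed RGB data

        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=75)
        jpg_size = buf.tell()

        return raw_size / jpg_size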

Obviously my approach would use up a lot of cycles and memory when rating large images, but that is acceptable in my scenario (fat server, not many uploads), and I could always build in a "short circuit" around the more expensive steps if the image exceeds a certain resolution.

Is there something else I can try, or is there a way to do this more efficiently?

+4  A: 

Assessing image quality (the same goes for sound or video) is not an easy task, and there are numerous publications tackling the problem.

Much depends on the nature of the image: a different set of criteria is appropriate for artificially created images (e.g. diagrams) than for natural images (e.g. photographs). There are subtle effects that have to be taken into consideration, like color masking, luminance masking, and contrast perception. For some images a given compression ratio is perfectly adequate, while for others it will result in a significant loss of quality.

Here is a free-access publication giving a brief introduction to the subject of image quality evaluation.

The method you mentioned, compressing the image and comparing the result with the original, is far from perfect. What metric do you plan to use? MSE? MSE per block? It is certainly not difficult to implement, but the results will be hard to interpret (consider images with and without high-frequency components).
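
For reference, a plain MSE between the original and its JPG round-trip could look like this (a NumPy/Pillow sketch; as said above, a single global number like this is hard to interpret across image types):

    import io
    import numpy as np
    from PIL import Image

    def jpg_roundtrip_mse(path, quality=75):
        # Mean squared error between an image and its JPG-compressed copy.
        original = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)

        buf = io.BytesIO()
        Image.open(path).convert("RGB").save(buf, format="JPEG", quality=quality)
        buf.seek(0)
        decoded = np.asarray(Image.open(buf).convert("RGB"), dtype=np.float64)

        return float(np.mean((original - decoded) ** 2))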

And if you want to delve deeper into the area of image quality assessment, there is also a lot of research done by the machine learning community.

Anonymous
+2  A: 

You could try looking at the EXIF tags of the image (using something like exiftool); what you get will vary a lot. On my SLR, for example, you even get which of the focus points were active when the image was taken. There may also be something about compression quality.
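
If you are in Python anyway, Pillow can read the tags too (a sketch only; exiftool will give you far more detail):

    from PIL import Image
    from PIL.ExifTags import TAGS

    def read_exif(path):
        # Map numeric EXIF tag ids to their human-readable names.
        exif = Image.open(path).getexif()
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}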

The other thing to check is the image histogram: watch out for images biased to the left, which suggests under-exposure, or for lots of saturated pixels (a spike at the right end).
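
A rough sketch of that check with Pillow; the cutoff values here are arbitrary and would need tuning:

    from PIL import Image

    def histogram_flags(path, dark_cutoff=64, dark_fraction=0.5):
        gray = Image.open(path).convert("L")
        hist = gray.histogram()                # 256 bins, index = luminance
        total = float(sum(hist))
        left_biased = sum(hist[:dark_cutoff]) / total > dark_fraction  # under-exposed?
        saturated = hist[255] / total > 0.05                           # blown highlights?
        return left_biased, saturated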

For image blur you could look at the high-frequency components of the Fourier transform, though this probably probes the same information as the JPG compression parameters anyway.
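
For instance (a NumPy sketch; the 25% low-frequency radius is an arbitrary choice), the fraction of spectral energy outside that radius should drop for blurry images:

    import numpy as np
    from PIL import Image

    def high_freq_fraction(path, radius_fraction=0.25):
        gray = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2  # centered power spectrum

        h, w = gray.shape
        y, x = np.ogrid[:h, :w]
        low = (y - h // 2) ** 2 + (x - w // 2) ** 2 <= (radius_fraction * min(h, w)) ** 2

        # Share of energy in the high frequencies; low values suggest blur.
        return float(spectrum[~low].sum() / spectrum.sum())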

This is a bit of a tricky area because most "rules" you might be able to implement could arguably be broken for artistic effect.

Ian Hopkinson
+2  A: 

I'd like to shoot down the "obviously incorporate resolution" idea. Resolution tells you nothing. I can scale an image by a factor of 2, quadrupling the number of pixels. This adds no information whatsoever, nor does it improve quality.
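
A two-line demonstration with Pillow ("photo.jpg" is a stand-in file name):

    from PIL import Image

    img = Image.open("photo.jpg")                       # stand-in input
    big = img.resize((img.width * 2, img.height * 2), Image.BICUBIC)
    print(img.size, "->", big.size)                     # 4x the pixels, no new detail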

I am not sure about the "compress to JPG" idea. JPG is a photo-oriented algorithm, and not all images are photos. Besides, a blue sky compresses quite well, and uniform grey even better. Do you think the exact cloud types determine the image quality?

Sharpness is a bad idea too, for similar reasons: depth of field is not trivially related to image quality. Items photographed against a black background will intentionally have a lot of pixels with quite low intensity. Again, this does not signal under-exposure, so the histogram isn't a good quality indicator by itself either.

MSalters
A: 

But what if the photos are "commercial"? Would the existing techniques still work if the photos are of everyday objects and purposefully non-artistic?

If I hire hundreds of people to take pictures of park benches, I want to quickly know which pictures are of better quality (in focus, well lit) and which aren't. I don't want pictures of kittens, people, sunsets, etc.

Or what if the pictures are supposed to be of items for a catalog? No models, just garments. Would image-quality processing help there?

tggagne