views: 76
answers: 3
Maybe you've noticed, but Google Image search now has a feature where you can narrow results by color. Does anyone know how they do this? Obviously, they've indexed color information about each image.

I am curious what the best methods are for analyzing an image's color data to allow simple color searching.

Thanks for any and all ideas!

A: 

Average color of all pixels? Make a histogram and find the average of the 'n' peaks?
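Both ideas can be sketched in a few lines. This is a minimal illustration, assuming the image has already been decoded into a flat list of (r, g, b) tuples; the function names and the coarse 32-level bucketing are my own choices, not anything Google has published:

```python
from collections import Counter

def average_color(pixels):
    """Mean of each channel across all pixels."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def histogram_peaks(pixels, n_peaks=3, bucket=32):
    """Quantize each channel into coarse buckets and return the
    n most common quantized colors (the histogram 'peaks')."""
    counts = Counter(
        (p[0] // bucket, p[1] // bucket, p[2] // bucket) for p in pixels
    )
    return [color for color, _ in counts.most_common(n_peaks)]

# 70% red pixels, 30% blue pixels
pixels = [(250, 10, 10)] * 70 + [(10, 10, 250)] * 30
print(average_color(pixels))       # (178.0, 10.0, 82.0)
print(histogram_peaks(pixels, 2))  # [(7, 0, 0), (0, 0, 7)]
```

Note the difference: the average lands on a muddy in-between color, while the histogram peaks keep red and blue distinct, which is usually closer to what a color search wants.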

Jim Buck
+1  A: 

Averaging the colours is a great start. Just downscale your image to 10% of the original size using a bicubic or bilinear filter (or something more advanced anyway). This will vastly reduce the colour noise and give you a result closer to how humans perceive the image. E.g. a pixel raster consisting purely of red and blue pixels would average out to a clean purple.

If you don't blur or downsize the image, you might still end up with an average of green, but the deviation would be huge.
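A sketch of the downscale-then-sample idea, using a simple box filter as a stand-in for the bicubic/bilinear filter suggested above (the function name and the pure-Python grid representation are mine):

```python
def box_downscale(grid, factor):
    """Average each factor x factor block of (r, g, b) pixels into one
    pixel -- a simple box filter standing in for bicubic/bilinear."""
    h, w = len(grid), len(grid[0])
    out = []
    for by in range(0, h - factor + 1, factor):
        row = []
        for bx in range(0, w - factor + 1, factor):
            block = [grid[y][x]
                     for y in range(by, by + factor)
                     for x in range(bx, bx + factor)]
            n = len(block)
            row.append(tuple(sum(p[i] for p in block) // n for i in range(3)))
        out.append(row)
    return out

# A 4x4 checkerboard of red and blue collapses to one purple pixel:
# the raw pixels are pure red or pure blue, but the downscaled colour
# is close to the perceived mix.
red, blue = (255, 0, 0), (0, 0, 255)
grid = [[red if (x + y) % 2 == 0 else blue for x in range(4)] for y in range(4)]
print(box_downscale(grid, 4))  # [[(127, 0, 127)]]
```

Sampling the pixels of the downscaled grid then gives you far fewer, far less noisy colours to index.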

David Rutten
Would I then take an average color of the scaled down image and use that for indexing purposes?
Joel Verhagen
You sample the colours of the downsized image.
David Rutten
A: 

The Google feature offers 12 colors with which to match images. So I would calculate the Lab coordinates of each of these swatches and plot the a*b* coordinate of each color in a two-dimensional space. I'd drop the L* component because the brightness of the pixel should be ignored. Using the 12 points in a*b* space, I'd calculate a partitioning using a Voronoi diagram. Then for a given image, I'd take each pixel and calculate its a*b* coordinate. Doing this for every pixel in the image builds up a histogram of counts in each Voronoi partition. The partition that contains the highest pixel count would then be considered the image's 'color'.

This would form the basis of the algorithm, although there would be refinements related to ignoring black and white background regions which are perceptually not considered to be part of the subject of the image.
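A sketch of the classification step described above, assuming pixels have already been converted to Lab. The swatch a*b* coordinates here are made-up illustrative values, not Google's actual palette; the key observation is that finding the nearest swatch point is exactly finding the pixel's Voronoi cell:

```python
from collections import Counter

# Hypothetical a*b* coordinates for a few swatches (illustrative only).
SWATCHES = {
    "red":    (80.0,  67.0),
    "green":  (-86.0, 83.0),
    "blue":   (68.0, -112.0),
    "yellow": (-22.0, 94.0),
}

def nearest_swatch(ab):
    """The nearest swatch in a*b* space -- i.e. the Voronoi cell
    that the point ab falls inside."""
    a, b = ab
    return min(SWATCHES,
               key=lambda s: (SWATCHES[s][0] - a) ** 2 + (SWATCHES[s][1] - b) ** 2)

def dominant_color(ab_pixels):
    """Histogram pixels by Voronoi cell; the fullest cell names the image."""
    counts = Counter(nearest_swatch(p) for p in ab_pixels)
    return counts.most_common(1)[0][0]

# Two reddish pixels and one greenish pixel (a*b* values).
pixels = [(75.0, 60.0), (82.0, 70.0), (-80.0, 80.0)]
print(dominant_color(pixels))  # red
```

With only 12 fixed swatches there is no need to build the Voronoi diagram explicitly; a nearest-neighbour search over the 12 points per pixel gives the same partition.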

Phillip Ngan