I am looking at making an app that uses a camera to measure the amount of light present when an image is taken. Some conditional behavior would take place based on how much light was present, e.g. display a message saying "Looks like bedtime" if it appears to be dark.

I understand that this will be a pretty poor measure of the actual amount of light present due to exposure and things like that, but it doesn't need to be super accurate.

I have no experience with image processing, so I don't even know what to ask, or what is feasible. Is this possible? Are there any C libraries for doing something like this?

+1  A: 

First random thought: apply a threshold filter to remove "objects", take what's left of the source image and threshold it again, then count white vs. black pixels. That should give you a VERY rough idea of light vs. dark. Thresholding
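A minimal sketch of the counting step, assuming the frame has already been converted to an 8-bit grayscale buffer (the threshold value itself is something you would tune):

```c
#include <stddef.h>

/* Fraction of pixels brighter than the threshold in an 8-bit
 * grayscale buffer. Values near 0.0 suggest a dark scene. */
double bright_fraction(const unsigned char *gray, size_t n_pixels,
                       unsigned char threshold)
{
    size_t bright = 0;
    for (size_t i = 0; i < n_pixels; i++)
        if (gray[i] > threshold)
            bright++;
    return n_pixels ? (double)bright / (double)n_pixels : 0.0;
}
```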

Joshua Weinberg
+1  A: 

A very rough estimate could be made by converting each colour channel value of each pixel to its intensity, using the known or assumed gamma of the camera. Then just sum the intensities across the whole image.

If you want the level to approximate that observed by a human, you will weight the green channel intensity higher and the blue channel lower (since our eyes are particularly sensitive to green, and insensitive to blue).

caf
A: 

If it needs to be very accurate, you should consider using regression techniques such as least squares (in particular, partial least squares). With these methods, you have a set of training data (reference images taken at several known times of day), and the method compares against them and interpolates to approximate the time of day.
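To make the idea concrete with something much simpler than partial least squares: a plain ordinary least-squares line fit on a single feature, where `x` might be a brightness measure per training image and `y` the known time of day (this is only a sketch of the training step, not the full method described above):

```c
#include <stddef.h>

/* Fit y = a*x + b by ordinary least squares over n samples.
 * Here x could be a per-image brightness value and y the known
 * time of day for that image. Assumes n >= 2 and non-constant x. */
void ols_fit(const double *x, const double *y, size_t n,
             double *a, double *b)
{
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (size_t i = 0; i < n; i++) {
        sx  += x[i];
        sy  += y[i];
        sxx += x[i] * x[i];
        sxy += x[i] * y[i];
    }
    double denom = (double)n * sxx - sx * sx;
    *a = ((double)n * sxy - sx * sy) / denom;
    *b = (sy - *a * sx) / (double)n;
}
```

Once `a` and `b` are fitted, a new image's brightness can be plugged into `a * brightness + b` to estimate the time.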

If it doesn't need to be that accurate, you can get a good approximation by simply calculating the average pixel distance from white/black. If the camera has auto-exposure, you can factor that in too.
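A sketch of that simple version, treating the average distance from black of an 8-bit grayscale buffer as the light level and wiring in the conditional behaviour from the question (the 0.25 cutoff is an arbitrary placeholder you would tune):

```c
#include <stddef.h>

/* Average distance from black: 0.0 = all black, 1.0 = all white,
 * for an 8-bit grayscale buffer. */
double avg_brightness(const unsigned char *gray, size_t n_pixels)
{
    double sum = 0.0;
    for (size_t i = 0; i < n_pixels; i++)
        sum += gray[i] / 255.0;
    return n_pixels ? sum / (double)n_pixels : 0.0;
}

/* Returns 1 if it "looks like bedtime", i.e. the scene is darker
 * than an arbitrary placeholder cutoff. */
int looks_like_bedtime(const unsigned char *gray, size_t n_pixels)
{
    return avg_brightness(gray, n_pixels) < 0.25;
}
```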

drharris