Hi All,
I am developing an application for the Android platform which contains 1000+ image filters that have been 'evolved'.
When a user selects a photo I want to present the most relevant filters first.
This 'relevance' should be based on previous usage.
I have already developed tools that register when a filtered image is saved; this combination of filter and image can be seen as the training data for my system.
The issue is that the comparison must happen between the user selecting an image and the next screen appearing. From a UI point of view I need the whole process to take less than 4 seconds: select an image -> obtain a metric to use for similarity -> check against use cases -> return the 6 closest matches. I figure with 4 seconds I can use animations and progress dialogs to keep the user happy.
Due to platform constraints I am fairly limited in the computational expense of the algorithm. I have implemented a technique, adapted from various online tutorials, for running native C code on the G1, so that language is available.
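To make the "check against use cases" step concrete, here is a rough C sketch of what I have in mind: a brute-force scan that keeps the 6 nearest training vectors. FEATURE_LEN, K and k_nearest are placeholder names of mine, and I am assuming the metric ends up as a fixed-length float vector compared with squared Euclidean distance:

```c
#include <float.h>
#include <stddef.h>

#define FEATURE_LEN 64   /* assumed metric size: 64 floats = 256 bytes */
#define K 6              /* number of matches to return */

/* Squared Euclidean distance between two feature vectors. */
static float dist_sq(const float *a, const float *b)
{
    float sum = 0.0f;
    for (size_t i = 0; i < FEATURE_LEN; i++) {
        float d = a[i] - b[i];
        sum += d * d;
    }
    return sum;
}

/* Linear scan over n training vectors; writes the indices of the
 * K nearest into out[0..K-1] (unsorted). O(n * FEATURE_LEN). */
void k_nearest(const float *query, const float *train, size_t n, int out[K])
{
    float best[K];
    for (int k = 0; k < K; k++) { best[k] = FLT_MAX; out[k] = -1; }

    for (size_t i = 0; i < n; i++) {
        float d = dist_sq(query, train + i * FEATURE_LEN);
        /* replace the current worst of the K best if this one is closer */
        int worst = 0;
        for (int k = 1; k < K; k++)
            if (best[k] > best[worst]) worst = k;
        if (d < best[worst]) { best[worst] = d; out[worst] = (int)i; }
    }
}
```

Even with a few thousand saved use cases this scan is only n * FEATURE_LEN multiply-adds, which should sit comfortably inside the 2-second comparison budget on a 528 MHz core.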
Specific constraints:
- Qualcomm® MSM7201A™, 528 MHz Processor
- 320 x 480 pixel bitmap in 32-bit ARGB
- ~ 2 seconds computational time for the native method to get the metric
- ~ 2 seconds to compare the metric of the current image with training data
This is an academic project so all ideas are welcome; anything you can think of or have heard about would be of interest to me.
My ideas:
- I want to keep the complexity down (O(n*m) for an n x m image?) by using pixel data only rather than a neighbourhood function
- I was looking at using the colour histogram / greyscale histogram / texture / entropy of the image and combining them to make the measure (see the sketch after this list)
- There will be an obvious loss of information, but I need the resultant metric to be substantially smaller than the memory footprint of the image (320 x 480 x 4 bytes, ~0.6 MB)
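To illustrate the second idea, here is a rough single-pass sketch in C that builds a per-channel colour histogram plus a greyscale-entropy term from the ARGB pixel data. BINS and extract_features are placeholder names, and 16 bins per channel is just my guess at a reasonable resolution:

```c
#include <math.h>
#include <stddef.h>
#include <stdint.h>

#define BINS 16   /* assumed: 16 bins per channel -> 3*16 + 1 floats out */

/* Build a normalised per-channel colour histogram and a greyscale
 * entropy term from a 32-bit ARGB bitmap. Output is 3*BINS + 1 floats,
 * far smaller than the bitmap itself. Single O(w*h) pass. */
void extract_features(const uint32_t *argb, int w, int h, float *out)
{
    int hist[3][BINS] = {{0}};
    int grey[256] = {0};
    const int n = w * h;

    for (int i = 0; i < n; i++) {
        uint32_t p = argb[i];
        int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
        hist[0][r * BINS / 256]++;
        hist[1][g * BINS / 256]++;
        hist[2][b * BINS / 256]++;
        grey[(r * 77 + g * 151 + b * 28) >> 8]++;   /* integer luma */
    }

    /* normalised histograms */
    for (int c = 0; c < 3; c++)
        for (int k = 0; k < BINS; k++)
            out[c * BINS + k] = (float)hist[c][k] / n;

    /* Shannon entropy of the greyscale distribution */
    float entropy = 0.0f;
    for (int v = 0; v < 256; v++) {
        if (grey[v] == 0) continue;
        float pr = (float)grey[v] / n;
        entropy -= pr * logf(pr);
    }
    out[3 * BINS] = entropy;
}
```

With these numbers the output is 3 x 16 + 1 = 49 floats (~196 bytes), so storing one vector per use case stays tiny compared with the ~0.6 MB bitmap, and the whole pass touches each pixel once.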
As I said, any ideas to direct my research would be fantastic.
Kind regards,
Gavin