Hi, I'm a beginner in iPhone software development. I'm developing an application about skin cancer in which I want to count the red pixels in a UIImage captured by the iPhone camera. Is it possible to count red pixels in a UIImage?

+1  A: 

Since this is a question that is asked almost weekly, I decided to make a little example project that shows how to do this. You can look at the code at:

http://github.com/st3fan/iphone-experiments/tree/master/Miscellaneous/PixelAccess/

The important bit is the following code, which takes a UIImage and counts the number of pure red pixels in it. It is an example that you can adapt for your own algorithms:

/**
 * Structure to keep one pixel in RRRRRRRRGGGGGGGGBBBBBBBBAAAAAAAA format
 */

struct pixel {
    unsigned char r, g, b, a;
};

/**
 * Process the image and return the number of pure red pixels in it.
 */

- (NSUInteger) processImage: (UIImage*) image
{
    NSUInteger numberOfRedPixels = 0;

    // Allocate a buffer big enough to hold all the pixels

    struct pixel* pixels = (struct pixel*) calloc(image.size.width * image.size.height, sizeof(struct pixel));
    if (pixels != NULL)
    {
        // Create a new bitmap

        // Use a device RGB color space so the buffer layout matches struct pixel,
        // even if the source image uses a different color space.

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(
            (void*) pixels,
            image.size.width,
            image.size.height,
            8,
            image.size.width * 4,
            colorSpace,
            kCGImageAlphaPremultipliedLast
        );
        CGColorSpaceRelease(colorSpace);

        if (context != NULL)
        {
            // Draw the image in the bitmap

            CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, image.size.width, image.size.height), image.CGImage);

            // Now that we have the image drawn in our own buffer, we can loop over the pixels to
            // process it. This simple case simply counts all pixels that have a pure red component.

            // There are probably more efficient and interesting ways to do this. But the important
            // part is that the pixels buffer can be read directly.

            NSUInteger numberOfPixels = image.size.width * image.size.height;

            // Walk the buffer with a separate cursor so that 'pixels' still
            // points at the start of the allocation when we free it below.

            struct pixel* p = pixels;
            while (numberOfPixels > 0) {
                if (p->r == 255) {
                    numberOfRedPixels++;
                }
                p++;
                numberOfPixels--;
            }

            CGContextRelease(context);
        }

        free(pixels);
    }

    return numberOfRedPixels;
}

A simple example on how to call this:

- (IBAction) processImage
{
    NSUInteger numberOfRedPixels = [self processImage: [UIImage imageNamed: @"DutchFlag.png"]];
    label_.text = [NSString stringWithFormat: @"There are %lu red pixels in the image", (unsigned long) numberOfRedPixels];
}

The example project on Github contains a complete working example.

St3fan
Hi St3fan, the above code gives me a constant red pixel count (8000) even when I change the photo. I have tried many different images but it gives the same count.
RRB
Then you are doing something wrong. The code works fine.
St3fan
Hi St3fan, when I replace DutchFlag.png in the code above with my own .png image, I get an error (the iPhone simulator drops into the debugger).
RRB
Hi, does the above code also support .jpg and other extensions?
RRB
Yes it works with all image types that UIImage supports.
St3fan
Hi St3fan, I am going to build an application that processes camera-captured .jpg skin images. Will it work with skin image pixel intensities?
RRB
Rajendra, the code is just an example on how to access the pixels. You will have to write the correct filters and algorithms for your specific project.
St3fan
St3fan, which steps should I follow to count RGB colors in an image captured by the iPhone camera?
RRB
Rajendra, maybe you should read a book about image processing.
St3fan