tags:
views: 325
answers: 3

I have created an NSImage object, and ideally would like to determine how many pixels of each colour it contains. Is this possible?

A: 

Get an NSBitmapImageRep from your NSImage. Then you can get access to the pixels.

NSImage* img = ...;
// A TIFF round-trip gives an NSBitmapImageRep whose pixels can be queried.
NSBitmapImageRep* raw_img = [NSBitmapImageRep imageRepWithData:[img TIFFRepresentation]];
// Returns the colour of the pixel at (0, 0).
NSColor* color = [raw_img colorAtX:0 y:0];
Iamamac
This is a very expensive approach, as `colorAtX:y:` will involve creating an `NSColor` instance for each pixel, as Peter Hosey notes. It is much more efficient to get the raw data buffer and walk through it using pointers to calculate the histogram.
gavinb
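A minimal sketch of the buffer-walking approach gavinb describes (not from the original answers; it assumes a meshed, 8-bit-per-sample RGB or RGBA rep, which you should verify via -bitmapFormat and -samplesPerPixel):

NSBitmapImageRep* rep = [NSBitmapImageRep imageRepWithData:[img TIFFRepresentation]];
NSCountedSet* histogram = [NSCountedSet set];
unsigned char* data = [rep bitmapData];
NSInteger width = [rep pixelsWide];
NSInteger height = [rep pixelsHigh];
NSInteger bytesPerRow = [rep bytesPerRow];
NSInteger samplesPerPixel = [rep samplesPerPixel];
for (NSInteger y = 0; y < height; y++) {
    unsigned char* row = data + y * bytesPerRow;
    for (NSInteger x = 0; x < width; x++) {
        unsigned char* px = row + x * samplesPerPixel;
        // Pack R, G, B into a single key and count it.
        unsigned int key = (px[0] << 16) | (px[1] << 8) | px[2];
        [histogram addObject:[NSNumber numberWithUnsignedInt:key]];
    }
}
// [histogram countForObject:...] now gives the pixel count for a given colour.

NSCountedSet keeps one count per distinct packed colour; for very large images a plain C array of 2^24 counters can be faster, at the cost of roughly 64 MB of memory.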
+2  A: 

I suggest creating your own bitmap context, wrapping it in a graphics context and setting that as the current context, telling the image to draw itself, and then accessing the pixel data behind the bitmap context directly.

This will be more code, but will save you both a trip through a TIFF representation and the creation of thousands or millions of NSColor objects. If you're working with images of any appreciable size, these expenses will add up quickly.

Peter Hosey
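A rough sketch of the bitmap-context approach Peter Hosey describes (the pixel format, drawing rect, and buffer handling here are assumptions, not from the answer):

NSInteger width = (NSInteger)[img size].width;
NSInteger height = (NSInteger)[img size].height;
unsigned char* pixels = calloc(width * height * 4, 1);  // RGBA, 8 bits per component
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, width * 4,
                                         colorSpace, kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);

// Wrap the bitmap context in an NSGraphicsContext and draw the image into it.
NSGraphicsContext* gc = [NSGraphicsContext graphicsContextWithGraphicsPort:ctx flipped:NO];
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:gc];
[img drawInRect:NSMakeRect(0, 0, width, height)
       fromRect:NSZeroRect
      operation:NSCompositeCopy
       fraction:1.0];
[NSGraphicsContext restoreGraphicsState];

// pixels now holds width * height RGBA samples in a format you chose yourself;
// count colours as in the NSBitmapImageRep sketch above, then clean up.
CGContextRelease(ctx);
free(pixels);

Because you created the context, you know the exact byte layout in advance, which is what saves the TIFF round-trip and the per-pixel NSColor objects.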
A: 

Look for "histogram" in the Core Image documentation.

NSResponder
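Core Image's CIAreaHistogram filter is one such option; a minimal sketch of driving it (the bin count and scale here are illustrative):

CIImage* ciImage = [CIImage imageWithData:[img TIFFRepresentation]];
CIFilter* filter = [CIFilter filterWithName:@"CIAreaHistogram"];
[filter setValue:ciImage forKey:kCIInputImageKey];
CGRect extent = [ciImage extent];
[filter setValue:[CIVector vectorWithX:extent.origin.x Y:extent.origin.y
                                     Z:extent.size.width W:extent.size.height]
          forKey:@"inputExtent"];
[filter setValue:[NSNumber numberWithInt:256] forKey:@"inputCount"];
[filter setValue:[NSNumber numberWithFloat:1.0f] forKey:@"inputScale"];
// The output is a 256x1 image whose columns hold the per-channel bin counts;
// render it into a bitmap to read them back.
CIImage* histogramImage = [filter valueForKey:@"outputImage"];

Note that CIAreaHistogram produces per-channel histograms (how many pixels have a given red, green, or blue value), not counts of distinct colours, so it only answers the question if channel histograms are enough.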