I have a collection of objects, each describing an image name, its size, and its X/Y location. The collection is sorted by "layers", so I can composite the images with a sort of painter's algorithm.
From this, I can determine the rectangle necessary to hold all of the images, so now what I want to do is:
- Create some sort of buffer to hold the result (the AppKit equivalent of the context created by UIGraphicsBeginImageContext on iPhoneOS).
- Draw all the images into the buffer.
- Snag a new NSImage out of the composited result of the buffer.
In iPhoneOS, this is the code that does what I want:
UIGraphicsBeginImageContext (woSize);
CGContextRef ctx = UIGraphicsGetCurrentContext();
[[UIColor clearColor] set];
CGContextFillRect(ctx, CGRectMake(0, 0, woSize.width, woSize.height));
// draw my various images, here.
// i.e. Various repetitions of [myImage drawAtPoint:somePoint];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
What I'm looking for is how to do the same thing in desktop Cocoa (AppKit).
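In case it helps, here's roughly what I imagine the desktop version looks like, using lockFocus/unlockFocus on a blank NSImage. I'm not at all sure this is the right approach (or that clearing with NSCompositeCopy behaves the way I expect), which is why I'm asking:

NSImage *result = [[NSImage alloc] initWithSize:woSize];
[result lockFocus];
// Clear the entire buffer to transparent before compositing.
[[NSColor clearColor] set];
NSRectFillUsingOperation(NSMakeRect(0, 0, woSize.width, woSize.height), NSCompositeCopy);
// Draw my various images here, i.e. various repetitions of:
// [myImage drawAtPoint:somePoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:1.0];
[result unlockFocus];
// result should now hold the composited image (I hope).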
Thanks!