I'm trying to blend a background with a foreground image, where the foreground image is a transparent image with lines on it.

I am trying to do it this way.

UIGraphicsBeginImageContext(CGSizeMake(320, 480));
CGContextRef context = UIGraphicsGetCurrentContext();   

// create rect that fills screen
CGRect bounds = CGRectMake( 0,0, 320, 480);

// This is my bkgnd image
CGContextDrawImage(context, bounds, [UIImage imageNamed:@"bkgnd.jpg"].CGImage);

CGContextSetBlendMode(context, kCGBlendModeSourceIn);

// This is my image to blend in
CGContextDrawImage(context, bounds, [UIImage imageNamed:@"over.png"].CGImage);

UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();

UIImageWriteToSavedPhotosAlbum(outputImage, self, nil, nil);
// clean up drawing environment
UIGraphicsEndImageContext();

but it does not seem to work.

Any suggestions will be appreciated.

A: 

You can use UIImage's drawInRect: or drawAtPoint: instead of CGContextDrawImage (they draw into the current context). Does using them make any difference in the output?
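
For example, a minimal sketch of the question's compositing done with drawInRect:, reusing the same image names and 320x480 size from the question (and the default kCGBlendModeNormal blend mode):

UIGraphicsBeginImageContext(CGSizeMake(320, 480));

CGRect bounds = CGRectMake(0, 0, 320, 480);

// drawInRect: draws into the current UIKit image context and handles
// the flipped coordinate system for you, unlike CGContextDrawImage
[[UIImage imageNamed:@"bkgnd.jpg"] drawInRect:bounds];
[[UIImage imageNamed:@"over.png"] drawInRect:bounds];

UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();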

It may also be helpful to make sure the UIImage* values you are getting back from imageNamed: are valid.

fbrereto
A: 

Can you provide more detail about what you mean by "it does not seem to work"? Does it draw only one image or the other? Does it draw black? Noise? Crash? Why have you chosen kCGBlendModeSourceIn? What effect are you trying to achieve (there are dozens of ways to blend images)? Does either of your images have alpha already?

I assume what you're trying to do is mix two images such that each has 50% opacity? Use CGContextSetAlpha() for that rather than CGContextSetBlendMode().
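
For instance, a rough sketch of that 50/50 mix using CGContextSetAlpha(); the image names and size here are placeholders:

UIGraphicsBeginImageContext(CGSizeMake(320, 480));
CGContextRef context = UIGraphicsGetCurrentContext();
CGRect bounds = CGRectMake(0, 0, 320, 480);

// draw the first image fully opaque
CGContextDrawImage(context, bounds, [UIImage imageNamed:@"first.png"].CGImage);

// draw the second image on top at 50% alpha
CGContextSetAlpha(context, 0.5);
CGContextDrawImage(context, bounds, [UIImage imageNamed:@"second.png"].CGImage);

// note: as in the question, CGContextDrawImage draws flipped in a UIKit
// context unless you mirror the CTM (see the transparency-layer answer below)
UIImage *mixed = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();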

Rob Napier
Thank you for your replies. Both my images have 100% opacity. When I blend the two, my context only draws the second image. On Mac OS X I use the source-over filter to composite the two images, and the second image, the one I want to blend in, already has alpha.
I believe you want to go back and study the basics of Quartz drawing. You seem to be confusing what it means to draw various layers with their own alpha with what it means to blend (which is related, but different in how it is implemented). Go to the Quartz 2D Programming Guide; it will teach you the Core Graphics you need so that you can do what you want easily and with good performance, predictability and flexibility. http://developer.apple.com/documentation/graphicsimaging/conceptual/drawingwithquartz2d/
Rob Napier
+3  A: 
UIImage* bottomImage = [UIImage imageNamed:@"bottom.png"];
UIImage* topImage    = [UIImage imageNamed:@"top.png"];
UIImageView* imageView = [[UIImageView alloc] initWithImage:bottomImage];
UIImageView* subView   = [[UIImageView alloc] initWithImage:topImage];
subView.alpha = 0.5;  // Customize the opacity of the top image.
[imageView addSubview:subView];
UIGraphicsBeginImageContext(imageView.frame.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* blendedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[subView release];
[imageView release];

[self doWhateverIWantWith: blendedImage];
Tyler
Thank you for your replies, but isn't this the same as having 50% opacity? Maybe the manner in which I phrased my question is wrong. Maybe I should say I want to merge two UIImages, where the image on top has alpha values.
If there are alpha values within the top image's PNG file, just leave out the "subView.alpha = 0.5;" line and it will draw the top image, including its own alpha values, on top of the bottom image.
Tyler
Worked great, thanks.
zekel
A: 

Hi,

I'm currently developing an application where I have to add text over an image at any position in the image (not as a subview), and the output should be a single image file with the original image and the text embedded in it. Any help will be appreciated.

e.g. a watermark on the image

Thanks, sivasankar

siva
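A minimal sketch of one way to do that, using the same image-context approach as the answers above; the image name, text, color, font, and position here are placeholders:

UIImage *baseImage = [UIImage imageNamed:@"photo.png"];
NSString *watermarkText = @"Sample watermark";

UIGraphicsBeginImageContext(baseImage.size);

// draw the original image first, then the text on top of it
[baseImage drawInRect:CGRectMake(0, 0, baseImage.size.width, baseImage.size.height)];

// the text draws in the current fill color (black unless you set one)
[[UIColor whiteColor] set];
[watermarkText drawAtPoint:CGPointMake(10, 10)
                  withFont:[UIFont boldSystemFontOfSize:24]];

UIImage *watermarkedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();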
A: 

Blending with alpha

// 'area' is the CGRect being composited into (for example the image bounds)
UIGraphicsBeginImageContext(area.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextRetain(context);

// mirror the context so CGContextDrawImage is not drawn upside down
CGContextTranslateCTM(context, 0.0, area.size.height);
CGContextScaleCTM(context, 1.0, -1.0);

// draw each image in its own transparency layer at the desired opacity
// ('images' is assumed to be an NSArray of UIImage objects, 'alpha' a value in 0.0-1.0)
for (UIImage *tempimg in images) {
    CGContextBeginTransparencyLayer(context, nil);
    CGContextSetAlpha(context, alpha);
    CGContextDrawImage(context, area, tempimg.CGImage);
    CGContextEndTransparencyLayer(context);
}

// get created image
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
CGContextRelease(context);
UIGraphicsEndImageContext();
sakrist
A: 

This is what I've done in my app, similar to Tyler's, but without the UIImageView:

UIImage *bottomImage = [UIImage imageNamed:@"bottom.png"];
UIImage *image = [UIImage imageNamed:@"top.png"];

// width and height are the desired dimensions of the output image
CGSize newSize = CGSizeMake(width, height);
UIGraphicsBeginImageContext( newSize );

// Use existing opacity as is
[bottomImage drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
// Apply supplied opacity
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height) blendMode:kCGBlendModeNormal alpha:0.8];

UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();

UIGraphicsEndImageContext();

If an image already has the opacity you want (as with bottomImage), you do not need to set it; otherwise you can supply one when drawing (as with image).

Eric