Hello :-)

I have a view in which I paint a dotted line as the user moves their finger across the screen of the iPad/iPhone. I draw the line between a point stored as LastLocation and the point that is the CurrentLocation. The drawings are meant to be persistent on the screen. This happens every time the touchesMoved event fires, and ends up letting me draw a dotted line tracing where the person has dragged their finger, just like a painting application.

I have a function that gets called when the touch event fires and does the following. The view contains a UIImageView called drawImage, which I use as a means of persisting the lines drawn. This clearly isn't how people usually build paint applications, as it is quite slow. Any insight into a better way to do persistent painting using CGContextRef calls would be appreciated.

/* Enter a bitmap graphics drawing context */
UIGraphicsBeginImageContext(frame.size);
/* Draw the previous frame back into the bitmap graphics context */
[drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)]; // originally self.frame.size.width, self.frame.size.height
CGContextRef ctx = UIGraphicsGetCurrentContext();
/* Dashed stroke in the chosen color, rounded cap, width 5 */
CGContextSetLineDash(ctx, 0, dashPattern, 2);
CGContextSetStrokeColorWithColor(ctx, color);
CGContextSetLineCap(ctx, kCGLineCapRound); // also kCGLineCapSquare, kCGLineCapButt
CGContextSetLineWidth(ctx, 5.0);
/* Stroke the new line segment */
CGContextMoveToPoint(ctx, LastLocation.x, LastLocation.y);
CGContextAddLineToPoint(ctx, Current.x, Current.y);
CGContextStrokePath(ctx);
/* Push the updated graphics back to the image */
drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

The 'drawImage.image drawInRect' call is extremely slow; it is, in essence, redrawing the entire image on every touch event.

Is there a faster way of doing this? I have seen drawing code like this on a few painting blogs, but it's just a bit slow. I would love to hear some thoughts on the topic.

+1  A: 

There is no need to composite the image and the line manually. Place the view that draws the line above a UIImageView that draws the image, and let the system do the compositing.

In your code, just move the stuff between the two drawImage lines into the drawRect: method of the line-drawing view.

- (void)drawRect:(CGRect)dirty {
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    /* Dashed stroke in the chosen color, rounded cap, width 5 */
    CGContextSetLineDash(ctx, 0, dashPattern, 2);
    CGContextSetStrokeColorWithColor(ctx, color);
    CGContextSetLineCap(ctx, kCGLineCapRound); // also kCGLineCapSquare, kCGLineCapButt
    CGContextSetLineWidth(ctx, 5.0);
    /* Stroke the current line segment */
    CGContextMoveToPoint(ctx, LastLocation.x, LastLocation.y);
    CGContextAddLineToPoint(ctx, Current.x, Current.y);
    CGContextStrokePath(ctx);
}

When one end of the line moves, save the new point and mark the line-drawing view as needing display. Both Current and LastLocation should be members of the line-drawing view, and the setter for each should call setNeedsDisplay.

Make sure clearsContextBeforeDrawing is YES and opaque is NO for the line drawing view.
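A minimal sketch of that overlay view, assuming class and property names of my own (LineView, lastLocation, currentLocation) and a view created in code rather than from a nib:

```objc
// Hypothetical overlay view; class and property names are assumptions.
@interface LineView : UIView
@property (nonatomic, assign) CGPoint lastLocation;
@property (nonatomic, assign) CGPoint currentLocation;
@end

@implementation LineView

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        self.opaque = NO;                       // let the UIImageView underneath show through
        self.clearsContextBeforeDrawing = YES;  // erase the previous segment before each redraw
        self.backgroundColor = [UIColor clearColor];
    }
    return self;
}

- (void)setCurrentLocation:(CGPoint)point {
    _lastLocation = _currentLocation; // the previous endpoint becomes the new start
    _currentLocation = point;
    [self setNeedsDisplay];           // triggers drawRect: on the next display pass
}

@end
```

The touch handler then only has to assign currentLocation; the system schedules the redraw and composites the overlay against the image view underneath.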

drawnonward
drawnonward, I've had a think about your solution and I'm not sure that I explained myself properly, so I will update the question. The UIImageView is only there to keep the lines that the user has drawn on the screen, because the context gets refreshed every time. The effect I am actually going for is a "paint brush" style stroke following the user's finger, with all of the painting staying persistent. I was using the UIImageView to maintain that persistence. Does that make sense? :-)
Fuzz
Your current code creates a new CGBitmapContext with UIGraphicsBeginImageContext at every call, then copies the contents of that context into a new image with UIGraphicsGetImageFromCurrentImageContext. That is a lot of unnecessary data being copied around. Instead, you should create one persistent CGBitmapContext that you draw into, and one CGImage that references the same pixels. Then draw that image in drawRect: or stick it in a UIImageView. Whenever the bitmap changes, call setNeedsDisplay on the view.
drawnonward
ahhh, awesome. Makes sense. I've spent some time trying to work it out, but how in the world do I create a CGImage backed by the same memory as a CGBitmapContext? I have created the context, and have a void* big enough for my data, but I'm not sure how to make the CGImage. Using CGBitmapContextCreateImage actually makes a copy, so that doesn't help :-(
Fuzz
ahhh, I found a friend in CGDataProviderCreateWithData, passing the result to CGImageCreate.
Fuzz
Yes, CGDataProviderCreateWithData is the missing link between CGImage and CGBitmapContext. Glad you got it.
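A sketch of that arrangement, assuming a fixed-size premultiplied-RGBA buffer (the sizes and variable names are illustrative, not from the thread):

```objc
size_t width  = 320;
size_t height = 480;
size_t bytesPerRow = width * 4;              // 4 bytes per RGBA pixel
void *pixels = calloc(height, bytesPerRow);  // shared backing store

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

/* One persistent context; every stroke is drawn into this exactly once. */
CGContextRef bitmapCtx =
    CGBitmapContextCreate(pixels, width, height, 8, bytesPerRow,
                          colorSpace, kCGImageAlphaPremultipliedLast);

/* A data provider that wraps the SAME buffer -- no copy is made. */
CGDataProviderRef provider =
    CGDataProviderCreateWithData(NULL, pixels, height * bytesPerRow, NULL);

/* A CGImage that reads directly from the shared pixels. */
CGImageRef image =
    CGImageCreate(width, height, 8, 32, bytesPerRow, colorSpace,
                  kCGImageAlphaPremultipliedLast, provider,
                  NULL, false, kCGRenderingIntentDefault);

CGColorSpaceRelease(colorSpace);
CGDataProviderRelease(provider);

/* Stroke new segments into bitmapCtx as touches arrive, then call
   setNeedsDisplay on the view that draws `image`; the pixels are
   shared, so nothing is recomposited or copied per stroke. */
```

One caveat: Quartz may cache a CGImage's decoded contents, so some setups re-create the lightweight CGImage wrapper each redraw; even then the pixel buffer itself is never copied.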
drawnonward
A: 

Fuzz, did you get this working? I am trying to figure out a very similar problem and would love to hear what you came up with. Thanks - fxshot

Mark A.