I am trying to detect a touch event on a PNG image loaded into a UIImageView. I have everything working except that the touch is tested against the image's bounding rectangle (as expected). What I would like to do is test whether the user has touched a visible part of the PNG, as opposed to the UIImageView itself.

For example, if I have a horseshoe image, I want it to respond only to touches on the sides, not on the center where nothing is drawn. I am at a loss on this one; Google reveals a number of people with the same issue but not even a hint of where to begin looking.

+1  A: 

Two ways:

a) Examine the pixel data of your image to determine whether the touched pixel is transparent. To make this possible you have to draw the image into an offscreen buffer: use CGContextDrawImage to render it, then CGBitmapContextGetData to access the pixel data of UIImage.CGImage. Apple has a Technical Q&A that explains the basic method of accessing pixel data.
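A minimal Swift sketch of approach a): render just the touched pixel of the image into a 1×1 offscreen RGBA buffer and inspect its alpha byte. The extension method name `isOpaque(at:)` and the alpha threshold are my own choices, and the sketch assumes the point is already in image coordinates (i.e. the UIImageView is not scaling the image via its contentMode).

```swift
import UIKit

extension UIImage {
    /// Returns true if the pixel at `point` (top-left-origin image
    /// coordinates) has alpha above `threshold`. Sketch only: ignores
    /// contentMode scaling between the view and the image.
    func isOpaque(at point: CGPoint, threshold: CGFloat = 0.1) -> Bool {
        guard let cgImage = self.cgImage else { return false }
        // A 1x1 RGBA buffer; we only rasterize the pixel we care about.
        var pixel: [UInt8] = [0, 0, 0, 0]
        guard let context = CGContext(
            data: &pixel,
            width: 1, height: 1,
            bitsPerComponent: 8, bytesPerRow: 4,
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
        ) else { return false }
        // Shift the image so the touched pixel lands at the context's
        // origin, accounting for Core Graphics' bottom-left origin.
        context.translateBy(x: -point.x, y: point.y - size.height)
        context.draw(cgImage, in: CGRect(origin: .zero, size: size))
        return CGFloat(pixel[3]) / 255.0 > threshold
    }
}
```

In a UIImageView subclass you could then override `point(inside:with:)` to return `image?.isOpaque(at: point) ?? false`, so taps on transparent regions fall through to views underneath.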

b) Keep a polygon representation of the horseshoe and use polygon hit testing to determine whether it was touched. Google "point in polygon" for algorithms.
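For approach b), a sketch of the standard ray-casting "point in polygon" test in Swift (the function name is mine): cast a horizontal ray from the point and count edge crossings; an odd count means the point is inside. Note that a shape with a hole, like a horseshoe, would need either multiple polygons or an even-odd combination of tests.

```swift
import CoreGraphics

/// Ray-casting point-in-polygon test. `polygon` is an ordered list of
/// vertices; the closing edge from last to first vertex is implied.
func polygonContains(_ p: CGPoint, polygon: [CGPoint]) -> Bool {
    var inside = false
    var j = polygon.count - 1
    for i in 0..<polygon.count {
        let a = polygon[i], b = polygon[j]
        // Does edge (a, b) straddle the horizontal line through p,
        // and does it cross that line to the right of p?
        if (a.y > p.y) != (b.y > p.y),
           p.x < (b.x - a.x) * (p.y - a.y) / (b.y - a.y) + a.x {
            inside.toggle()
        }
        j = i
    }
    return inside
}
```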

a) is probably less work if you need this for just a few images, but if you do a lot of hit testing (e.g. a game with lots of movement), b) might be better.

tequilatango
A) is a step in the right direction. However, it would not pass the touch event through to any image that happens to be lying below it. B) Some of the images will be very complex and may have holes in their centers that would not be conveyed easily by a polygon. A) does seem like the better method here and I will look into it more deeply, but I have the feeling it will open up more problems than it solves.
Seth Bailey