Hi all,
I have a UIView whose layer contains a number of sublayers (instances of CALayer subclasses). I am using the following code to work out which layer a touch event corresponds to:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    // Log the touch location and the name of the layer it hits.
    NSLog(@"%@, %@", NSStringFromCGPoint(point), [self.layer hitTest:point].name);
}
This works fine until the device is rotated. On rotation, all of the current layers are removed from the superlayer and new CALayers are created to fit the new orientation. The new layers are inserted correctly and display in the correct orientation.
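For reference, my rotation handling is roughly the following (a simplified sketch; the real layout code is more involved, and the method and layer names here are just placeholders):

- (void)rebuildLayersForCurrentBounds {
    // Tear down the old layers. Iterate over a copy so we don't
    // mutate the sublayers array while enumerating it.
    for (CALayer *sublayer in [self.layer.sublayers copy]) {
        [sublayer removeFromSuperlayer];
    }

    // Create fresh layers sized for the new orientation.
    CGSize size = self.bounds.size;
    CALayer *topLeft = [CALayer layer];
    topLeft.frame = CGRectMake(0.0f, 0.0f, size.width / 2.0f, size.height / 2.0f);
    topLeft.name = @"topLeft";
    [self.layer addSublayer:topLeft];
    // ... and so on for the remaining layers ...
}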
After the rotation, the hitTest method consistently returns nil for the layer. I have also noticed that when the device is rotated 180 degrees, the returned layer is the one that was at that location before the rotation: touching the top-left layer returns the layer that is now in the bottom right. The coordinates of the touch are printed as expected, with (0,0) in the top left. I redraw the layers on every rotation, but for some reason hit testing seems to treat them as if the view were still the "correct" way up, with the home button at the bottom. Am I missing a function call or something after handling the rotation?
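To illustrate the symptom: after a 180-degree rotation, the result is as if hitTest were being handed the touch point flipped through the centre of the view (this is just a description of the behaviour I see, not code I am running):

    // A touch at (x, y) hits the layer that is on screen at
    // (width - x, height - y), i.e. as if hitTest saw this point:
    CGPoint flipped = CGPointMake(self.bounds.size.width - point.x,
                                  self.bounds.size.height - point.y);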
Cheers, Adam