I have a view hierarchy that contains smaller views on a scroll view. Each of these views can have subviews of its own, such as buttons.

For some reason, buttons on these views don't respond to taps; exploring this further showed that while the scroll view receives the touchesBegan:withEvent: event, the button does not. Sending the hitTest:withEvent: message shows that the button is not returned, even though the touch falls within its bounds.

I've included log output describing the touch's location on the scroll view, the view returned from hitTest:withEvent:, the touch's location as reported by locationInView: on the expected view, and the hierarchy of the expected view with frames printed. From this output I can only deduce that the button should have been hit...

Can anyone explain this? Am I missing something?
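
(For reference, a hierarchy dump like the one below could be produced with a small helper along these lines. This is a hypothetical reconstruction, not the code that actually generated the log; the name VCLogHierarchy is made up.)

#import <UIKit/UIKit.h>

// Hypothetical helper: walks from a view up through its superviews, printing
// each view's description (which includes its frame) and its layer's affine
// transform, indented by depth.
static void VCLogHierarchy(UIView *leaf) {
    NSUInteger depth = 0;
    for (UIView *view = leaf; view != nil; view = view.superview, depth++) {
        NSMutableString *indent = [NSMutableString string];
        for (NSUInteger i = 0; i < depth; i++) {
            [indent appendString:@" "];
        }
        CGAffineTransform t = [view.layer affineTransform];
        NSLog(@"%@view: %@, layer transform: [%g, %g, %g, %g, %g, %g]",
              indent, view, t.a, t.b, t.c, t.d, t.tx, t.ty);
    }
}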

touched ({451, 309}) on <VCViewContainersView: 0x4b31ee0; frame = (0 0; 748 1024); transform = [0, 1, -1, 0, 0, 0]; autoresize = W+H; layer = <CALayer: 0x4b32130>> (location in expected item: {17, 7.5})
expected touched item is:
view: <UIButtonLabel: 0x482b920; frame = (32 5; 36 19); text = 'Click'; clipsToBounds = YES; opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x4831370>>, layer transform: [1, 0, 0, 1, 0, 0]
 view: <UIRoundedRectButton: 0x482c100; frame = (50 50; 100 30); opaque = NO; layer = <CALayer: 0x482c450>>, layer transform: [1, 0, 0, 1, 0, 0]
  view: <UIImageView: 0x480f290; frame = (0 0; 320 255); opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x480e840>>, layer transform: [1, 0, 0, 1, 0, 0]
   view: <VCViewContainer: 0x4b333c0; frame = (352 246.5; 320 471.75); layer = <CALayer: 0x4b33d50>>, layer transform: [1, 0, 0, 1, 0, 0]
    view: <UIScrollView: 0x4b32600; frame = (0 0; 1024 748); clipsToBounds = YES; autoresize = W+H; userInteractionEnabled = NO; layer = <CALayer: 0x4b32780>>, layer transform: [1, 0, 0, 1, 0, 0]
     view: <VCViewsContainerView: 0x4b31ee0; frame = (0 0; 748 1024); transform = [0, 1, -1, 0, 0, 0]; autoresize = W+H; layer = <CALayer: 0x4b32130>>, layer transform: [0, 1, -1, 0, 0, 0]
      view: <UIWindow: 0x4b1d590; frame = (0 0; 768 1024); opaque = NO; autoresize = RM+BM; layer = <CALayer: 0x4b1d6d0>>, layer transform: [1, 0, 0, 1, 0, 0]

Update: Other than the UIWindow and VCViewsContainerView, all views are created programmatically using initWithFrame: or, in the case of the button, buttonWithType:. The VCViewContainer is initialized with CGRectZero, and when the UIImageView is created, its frame is set to the image's size plus additional space for labels at the bottom.

Update 2: When calling [self.layer hitTest:location] with the same location, I get the layer of the correct view! What's going on here...?
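
(A minimal sketch of the two calls being compared; containerView is a stand-in name, not something from the project. Note that -[UIView hitTest:withEvent:] expects the point in the receiver's own coordinate system, while -[CALayer hitTest:] expects it in the superlayer's coordinate system, so "the same location" only lines up when those spaces coincide.)

// Minimal sketch; containerView is a stand-in for the view being tested.
CGPoint location = CGPointMake(451, 309);

// View hit-testing: honors userInteractionEnabled, hidden, and alpha.
UIView *hitView = [containerView hitTest:location withEvent:nil];
NSLog(@"view hit test  -> %@", hitView);

// Layer hit-testing: pure geometry, ignores userInteractionEnabled.
// -[CALayer hitTest:] interprets the point in the superlayer's coordinates.
CALayer *hitLayer = [containerView.layer hitTest:location];
NSLog(@"layer hit test -> %@", hitLayer);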

A: 

If I understand the view stack correctly, your button is a subview of a UIImageView, whose userInteractionEnabled property is set to NO (the default for image views), so the image view and all of its subviews won't receive any touch events.

Setting the image view's userInteractionEnabled property to YES should solve the problem.

Vladimir
Just in case, I tried that - but unfortunately that isn't the case. I don't think what you said is true; the docs specify that the search for an event handler goes from top to bottom, so the button should be tested first. And in any case, hitTest: is not affected by user interaction at all...
Aviad Ben Dov
A: 

hitTest:withEvent: starts at the window. Each view tests its subviews before testing itself, and so on, recursively. If a view's userInteractionEnabled is NO, however, it returns nil from hitTest:withEvent:, without testing its subviews. Such a view is certainly hit-tested, but it immediately replies that neither it nor any of its subviews is the hit view.
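
In other words, the default implementation behaves roughly like this sketch (simplified and conceptual, not Apple's actual code; the real implementation also rejects hidden views and views with alpha below about 0.01):

// Rough sketch of UIView's default hit-testing behavior (not Apple's code).
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // A non-interactive view opts out for itself *and* all of its subviews.
    if (!self.userInteractionEnabled || self.hidden || self.alpha < 0.01) {
        return nil;
    }
    if (![self pointInside:point withEvent:event]) {
        return nil;
    }
    // Front-most subviews (last in the subviews array) are asked first.
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint converted = [subview convertPoint:point fromView:self];
        UIView *hit = [subview hitTest:converted withEvent:event];
        if (hit != nil) {
            return hit;
        }
    }
    // No subview claimed the point, so this view is the hit view.
    return self;
}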

Your UIScrollView has its userInteractionEnabled set to NO. Thus, when the VCViewContainersView tests its subview, the UIScrollView, and the UIScrollView returns nil because its userInteractionEnabled is NO, the VCViewContainersView calls pointInside:withEvent: on itself, finds that the touch is within its bounds, and returns itself as the hit view (and the search ends). This explains the result you are getting.

The reason this doesn't happen when you do the hit-test by way of the layers is that layers are not touchable and know nothing about the rules for touches, so they ignore userInteractionEnabled, which is a view feature, not a layer feature. Layer hit-testing is sort of a kludge, intended only for when a view contains a whole layer hierarchy (without a view hierarchy) and you want to simulate that a particular layer is touchable. The docs do tell you that the logic is different for a layer's hitTest: than it is for a view's hitTest:withEvent:, though they fail to explain exactly how.

I don't know why you have set your scroll view to be non-touchable, but if that's important to you, you can override hitTest:withEvent: in a UIScrollView subclass so that it tests its subviews but returns nil if all of them return nil.
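
A minimal sketch of that idea, assuming a hypothetical subclass name (the scroll view's own userInteractionEnabled can stay NO, since this override replaces the default behavior that would have returned nil):

#import <UIKit/UIKit.h>

// Hypothetical subclass: lets hit-testing fall through to the subviews while
// the scroll view itself never becomes the hit view.
@interface PassthroughScrollView : UIScrollView
@end

@implementation PassthroughScrollView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if (![self pointInside:point withEvent:event]) {
        return nil;
    }
    // Ask the subviews front to back; return the first one that claims the point.
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint converted = [subview convertPoint:point fromView:self];
        UIView *hit = [subview hitTest:converted withEvent:event];
        if (hit != nil) {
            return hit;
        }
    }
    // All subviews returned nil, so neither they nor the scroll view is hit.
    return nil;
}

@end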

matt