views:

1901

answers:

6

I have a sublayer on a layer-backed view. The sublayer's contents are set to a CGImageRef, and its frame is a 25x25 rect.
I perform a hit test on the superlayer when the touchesBegan and touchesMoved methods are invoked. The hit test does, in fact, return the sublayer when it is touched, BUT only if the bottom half of the image is touched. If the top half of the image is touched, it returns the superlayer instead.

I do know that the iPhone OS compensates for the tendency of user touches to land lower than intended. Even if I resize the sublayer to something larger, say 50x50, it exhibits the same behavior.

Any thoughts?

A: 

It seems that the layer is not just receiving touches on its bottom pixels. Instead, the layer as drawn on screen and the region that responds to touches seem to be defined by different CGRects. The image is displayed at the expected coordinates, while the layer that responds to touches is offset below the image. By below, I mean that the origin of the image is at (200, 200) while the origin of the touch-responsive region is at (200, 220).

Below is some test code I used to recreate the problem: first my view subclass, then my view controller. Any insight into this problem is greatly appreciated.

My View subclass:

#import "myView.h"
#import <QuartzCore/QuartzCore.h>


@implementation myView


- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {

    CGPoint currentLocation = [[touches anyObject] locationInView:self];

    CALayer *currentLayer = [self.layer hitTest:currentLocation];

    CGPoint center = [currentLayer position];

    NSLog(@"%@", [currentLayer valueForKey:@"isRootLayer"]);

    // Toggle the hit sublayer between two positions, ignoring the root layer
    if (![currentLayer valueForKey:@"isRootLayer"]) {
        if (center.x != 200) {
            [currentLayer setPosition:CGPointMake(200.0f, 200.0f)];
        } else {
            [currentLayer setPosition:CGPointMake(100.0f, 100.0f)];
        }
    }
}


- (id)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
        // Initialization code
    }
    return self;
}


- (void)drawRect:(CGRect)rect {
    // Drawing code
}


- (void)dealloc {
    [super dealloc];
}


@end

My View Controller:

#import "layerTestViewController.h"
#import <QuartzCore/QuartzCore.h>


#define ELEMENT_PERIOD_SIZE 50
#define ELEMENT_GROUP_SIZE 50

@implementation layerTestViewController





// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad {
    [super viewDidLoad];

    CALayer *myLayer = [CALayer layer];
    CGRect layerFrame = CGRectMake(0.0f, 0.0f, ELEMENT_GROUP_SIZE, ELEMENT_PERIOD_SIZE);
    myLayer.frame = layerFrame;

    [myLayer setName:@"test"];
    [myLayer setValue:@"testkey" forKey:@"key"];

    UIGraphicsBeginImageContext(layerFrame.size);
    [[UIColor blueColor] set];
    UIRectFill(layerFrame);
    UIImage *theImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    myLayer.contents = (id)[theImage CGImage];
    myLayer.position = CGPointMake(100.0f, 100.0f);

    [self.view.layer addSublayer:myLayer];

    [self.view.layer setValue:@"YES" forKey:@"isRootLayer"];
    NSLog(@"%@", [self.view.layer valueForKey:@"isRootLayer"]);

}




- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning]; // Releases the view if it doesn't have a superview
    // Release anything that's not essential, such as cached data
}


- (void)dealloc {
    [super dealloc];
}

@end
Corey Floyd
A: 

If touches are being recognized only on the lower part, one possibility is that another subview is covering the top half of this subview. Do you have multiple subviews, or only one image? If you have multiple subviews and any of them use clearColor as their background color, try giving them a solid background color for testing. That way you'll know whether your subview is being covered by another subview.

Hope that helps.

lostInTransit
A: 

I have simplified my code to isolate the problem. I have only one sublayer on screen and no subviews. The code listed above is what I am running in the simulator. Since there are only two layers on screen, the backing layer of the host view and the sublayer I added, nothing should interfere with the hit test.

In case it wasn't clear: when the top portion of the sublayer is touched, the hit test returns the backing layer instead. Also, if I touch the backing layer at a point just below the sublayer, the hit test returns the sublayer.

It is hard to explain, but if you run the code, it becomes clear.

Thanks

Corey Floyd
A: 

When I try your code, I get almost-correct behavior, except that touches are often not recognized on the very edge of the sublayer. I am a bit confused as to what's going on. One thing you might try: when you register a touch, throw up a new tiny layer where the touch occurred (say, a 5x5 square of some color) and remove it again 3 seconds later. That way you can track where Core Animation thinks the touch occurred versus where your finger actually is.
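A minimal sketch of that debugging idea, assuming it lives in the view subclass from the question (the marker size, color, and 3-second delay are arbitrary choices):

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        CGPoint p = [[touches anyObject] locationInView:self];

        // Drop a small marker layer centered on the reported touch point
        CALayer *marker = [CALayer layer];
        marker.frame = CGRectMake(p.x - 2.5f, p.y - 2.5f, 5.0f, 5.0f);
        marker.backgroundColor = [UIColor redColor].CGColor;
        [self.layer addSublayer:marker];

        // Remove the marker again after 3 seconds
        [marker performSelector:@selector(removeFromSuperlayer)
                     withObject:nil
                     afterDelay:3.0];
    }

Comparing the marker's on-screen position with where you actually touched makes any coordinate offset visible immediately.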

BTW, you don't need to create a CGImage for your blue content; you can just set the layer's backgroundColor to [UIColor blueColor].CGColor.
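For example, the blue layer from the question's viewDidLoad could be created without any image context at all (a sketch, using the same 50x50 size and position as the original code):

    CALayer *myLayer = [CALayer layer];
    myLayer.frame = CGRectMake(0.0f, 0.0f, 50.0f, 50.0f);

    // A solid fill needs no CGImage; backgroundColor is enough
    myLayer.backgroundColor = [UIColor blueColor].CGColor;

    myLayer.position = CGPointMake(100.0f, 100.0f);
    [self.view.layer addSublayer:myLayer];

This drops the UIGraphicsBeginImageContext / UIRectFill / UIGraphicsGetImageFromCurrentImageContext dance entirely.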

Kevin Ballard
A: 

Thanks Kevin...

I ended up just redoing my code with UIViews. It was too much trouble trying to figure out why the view didn't know which layer was being touched. My whole reasoning behind the idea was that I will have 100+ items on screen at one time, and I thought layers would be easier on the processor than UIViews. I have the app up and running now, and 100 views isn't giving it any hiccups.

My suggestion to others: just stick with hit testing UIViews when possible; it makes the code much simpler.

Corey Floyd
+4  A: 

The documentation for hitTest says:

/* Returns the farthest descendant of the layer containing point 'p'.
 * Siblings are searched in top-to-bottom order. 'p' is in the
 * coordinate system of the receiver's superlayer. */

So you need to do something like this:

CGPoint thePoint = [touch locationInView:self];
thePoint = [self.layer convertPoint:thePoint toLayer:self.layer.superlayer];
CALayer *theLayer = [self.layer hitTest:thePoint];

(Repeating answer from other post for completeness' sake)
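Applied to the touchesBegan: method from the question, the fix might look like this (a sketch; it assumes the hit test is performed from within the view subclass, so that self.layer.superlayer is the superview's backing layer):

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        UITouch *touch = [touches anyObject];

        // locationInView: yields a point in this layer's own coordinate
        // space, but hitTest: expects superlayer coordinates, so convert
        CGPoint thePoint = [touch locationInView:self];
        thePoint = [self.layer convertPoint:thePoint
                                    toLayer:self.layer.superlayer];

        CALayer *theLayer = [self.layer hitTest:thePoint];
        NSLog(@"Hit layer: %@", theLayer);
    }

With the point converted, the layer returned should match the layer under the touch, rather than one offset by the view's own origin.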

schwa
Thanks! I had the same problem, and it was driving me crazy. I'm still a bit confused as to why it's necessary, but it seems to be working.
Mark Bessey
I think one issue may be that CALayer coordinates are flipped relative to UIView coordinates (bottom left as the origin)? But I would love to hear from someone who can elaborate on the view and layer hierarchies and really explain what else is going on.
Pat Niemeyer