Hopefully I can describe my situation clearly, as whenever I try to describe it to others on a whiteboard or with pen and paper it takes a few attempts :)
Part of my program involves a cart that rides along a rail on two wheels. The rail is a simple line that curves up and down hills and also flattens out. To create the level I use an NSMutableArray of CGPoints and draw them to the screen.
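For reference, the level data looks roughly like this (the values here are made up; I box the CGPoints in NSValues, since a plain struct can't go straight into an NSMutableArray):

    #import <UIKit/UIKit.h>

    // Track points, kept sorted by increasing x. CGPoint is a struct, so
    // each one is boxed in an NSValue before going into the array.
    NSMutableArray *points = [NSMutableArray array];
    [points addObject:[NSValue valueWithCGPoint:CGPointMake(  0.0, 100.0)]];
    [points addObject:[NSValue valueWithCGPoint:CGPointMake( 60.0, 130.0)]]; // uphill
    [points addObject:[NSValue valueWithCGPoint:CGPointMake(120.0, 130.0)]]; // flat
    [points addObject:[NSValue valueWithCGPoint:CGPointMake(180.0,  90.0)]]; // downhill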
My current collision detection takes each wheel's x-coordinate and compares it to the x-values in the array of points. When it falls between two points' x-values, I use the gradient of that segment to work out whether the wheel is above or below the track's y-value at that x. This works fine when I use definite values of x.
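In code, the check is essentially this (method and property names are simplified for illustration):

    // self.points is the NSMutableArray of boxed CGPoints, sorted by x.
    // Finds the pair of points straddling x and linearly interpolates
    // along that segment's gradient to get the track height there.
    - (CGFloat)trackYAtX:(CGFloat)x
    {
        for (NSUInteger i = 0; i + 1 < self.points.count; i++) {
            CGPoint p0 = [self.points[i] CGPointValue];
            CGPoint p1 = [self.points[i + 1] CGPointValue];
            if (x >= p0.x && x <= p1.x) {
                CGFloat gradient = (p1.y - p0.y) / (p1.x - p0.x);
                return p0.y + gradient * (x - p0.x);
            }
        }
        return 0.0; // x is off either end of the track
    }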
So here comes the problem.
First, I have a front wheel that rolls along this line perfectly, with a correct x and y position. The problem I'm having is creating a back wheel that follows this front wheel. The back wheel also follows the line perfectly, but it is always a set x-distance behind the front wheel. This proves unrealistic: going up or down steep hills, the distance between the wheels becomes too great because of the difference in y.
Flat example:

    O.......O

Hill example:

    O
    ........O
The x-distance is the same, but the difference in y makes the true (hypotenuse) distance greater, which gives the impression of the wheels spreading apart. I need to constrain the back wheel to the front wheel so that its true (hypotenuse) distance is always the same. The difficulty is that, since this means changing the back wheel's x-position, I can't get a handle on its collision detection.
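Put another way, the invariant I want to hold every frame is just Pythagoras on the two wheel centres:

    #import <CoreGraphics/CoreGraphics.h>
    #import <math.h>

    // Straight-line (hypotenuse) distance between the wheel centres;
    // this should always equal the cart's fixed wheelbase.
    static CGFloat WheelDistance(CGPoint front, CGPoint back)
    {
        CGFloat dx = front.x - back.x;
        CGFloat dy = front.y - back.y;
        return sqrt(dx * dx + dy * dy);
    }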
If I start to go uphill, I have to change the back wheel's y-velocity so it drops behind the front wheel; then I need to change its x-value so it stays the same true distance behind the front wheel; but now its y-value is wrong, since it ends up inside the floor.
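What I suspect I need, though I haven't managed to wire it into my collision detection, is to solve for the back wheel's position directly rather than nudging velocities: intersect a circle of radius wheelbase, centred on the front wheel, with the track segments behind it. A sketch of that idea (the helper name is made up):

    #import <CoreGraphics/CoreGraphics.h>
    #import <math.h>

    // Finds the point on segment a->b that lies exactly `wheelbase` away
    // from the front wheel. P(t) = a + t*(b - a); |P(t) - front|^2 =
    // wheelbase^2 expands to a quadratic in t. Returns YES if a root
    // lands inside the segment (0 <= t <= 1) and writes the point out.
    static BOOL BackWheelOnSegment(CGPoint a, CGPoint b, CGPoint front,
                                   CGFloat wheelbase, CGPoint *outBack)
    {
        CGFloat dx = b.x - a.x,     dy = b.y - a.y;
        CGFloat fx = a.x - front.x, fy = a.y - front.y;

        CGFloat qa = dx * dx + dy * dy;
        CGFloat qb = 2.0 * (fx * dx + fy * dy);
        CGFloat qc = fx * fx + fy * fy - wheelbase * wheelbase;

        CGFloat disc = qb * qb - 4.0 * qa * qc;
        if (qa == 0.0 || disc < 0.0) return NO; // circle misses this segment

        CGFloat root = sqrt(disc);
        CGFloat t1 = (-qb - root) / (2.0 * qa);
        CGFloat t2 = (-qb + root) / (2.0 * qa);

        // Take a root that actually lies on the segment.
        CGFloat t = (t1 >= 0.0 && t1 <= 1.0) ? t1 : t2;
        if (t < 0.0 || t > 1.0) return NO;

        *outBack = CGPointMake(a.x + t * dx, a.y + t * dy);
        return YES;
    }

Walking the segments backwards from the one the front wheel sits on and taking the first hit would give a back wheel that is both on the track and exactly the wheelbase away. Is that the right idea, and if so, how do I reconcile it with my x-based collision detection?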
I've racked my brains (and the office's) for a few days now. Any help, please?