I am currently trying to convert a touch from screen space into world space for a 2D game I am working on.

The view I am working on has nearly the same coordinate system as the screen: if someone touches the pixel at x = 345, y = 500 on the screen, it maps to the same point on the view, except that y is flipped because OpenGL puts the origin in the lower-left corner instead of the upper-left.
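
So the only per-axis conversion is the vertical flip, something like this (touchLocation and backingHeight are just illustrative names here, assuming the touch and the GL backing buffer share the same pixel dimensions):

    GLfloat glX = touchLocation.x;
    GLfloat glY = (GLfloat)backingHeight - touchLocation.y;   // OpenGL's origin is the lower-left corner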

The "camera" is at 0,0,0 looking down negative Z. I say "Camera" since I haven't coded one yet. Right now I am just translating every sprites Z to -100.

The pseudocode I have tried so far (and have been double-checking in Mathematica) is this:

    // scale the screen point into a value from 0 to 1
    point = { screenPoint.x / screenWidth, screenPoint.y / screenHeight, -100, 1 }
    // doing a full inverse here, not just swapping rows and cols
    out = Inverse(viewProjection) * point
    inverseW = 1 / out.w
    finalWorldCoord = out * inverseW
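
(For reference, my understanding of the usual gluUnProject-style recipe is below, in case I am misapplying it. This is only a rough sketch with a plain column-major 4x4 multiply; Vec4, mat4MulVec4, and invViewProjection are illustrative names, where invViewProjection is the same full inverse I compute above.)

    typedef struct { float x, y, z, w; } Vec4;

    // multiply a column-major 4x4 (OpenGL layout, float m[16]) by a vec4
    static Vec4 mat4MulVec4(const float m[16], Vec4 v)
    {
        Vec4 r;
        r.x = m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12]*v.w;
        r.y = m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13]*v.w;
        r.z = m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14]*v.w;
        r.w = m[3]*v.x + m[7]*v.y + m[11]*v.z + m[15]*v.w;
        return r;
    }

    // ndcZ is -1 at the near plane and +1 at the far plane, not a world Z
    static Vec4 unproject(const float invViewProjection[16],
                          float screenX, float screenY,
                          float screenWidth, float screenHeight, float ndcZ)
    {
        Vec4 ndc;
        ndc.x = (screenX / screenWidth) * 2.0f - 1.0f;          // 0..width  -> -1..1
        ndc.y = 1.0f - (screenY / screenHeight) * 2.0f;          // flip y for GL
        ndc.z = ndcZ;
        ndc.w = 1.0f;

        Vec4 world = mat4MulVec4(invViewProjection, ndc);
        world.x /= world.w;                                      // perspective divide
        world.y /= world.w;
        world.z /= world.w;
        return world;
    }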

The issue is that this is giving me values that are way less than what they should be, and I am not sure why.

This is with OpenGL ES 2.0, on iPhone OS 3.2.

Does anyone know the correct way to do this?

A: 

I think you want to start by getting the points using -[UIView convertPoint:toView:], feeding in your top-most view. That will give you absolute screen coordinates.
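
Something like this, for example (a minimal sketch in a UIView subclass; only the UIKit calls are real, the rest is illustrative):

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        UITouch *touch = [touches anyObject];
        // point in this view's own coordinate system
        CGPoint local = [touch locationInView:self];
        // nil converts to the window's coordinates; pass your top-most view instead if you prefer
        CGPoint absolute = [self convertPoint:local toView:nil];
        NSLog(@"local %@ -> absolute %@",
              NSStringFromCGPoint(local), NSStringFromCGPoint(absolute));
    }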

Otherwise, I suggest setting breakpoints/NSLogs to watch the values transform so you can see where the numbers lose their proper scale.

TechZen
The screen coordinates are correct; the touches are being sent to the only view in the project. I grabbed the matrices and vectors from the debugger, went through the algorithm in Mathematica, and verified that there isn't a mistake in the math somewhere. My current guess is that I'm not accounting for Z correctly. The numbers I'm getting after the transform would make sense if I had a small Z value of, say, -2, but at -100 they don't make sense.
longshot
You should provide some of the input and output so we can get a feel for what you're seeing, along with an example of the output you would expect. The math looks very simple, so I'm not clear on what is going wrong.
TechZen
A: 

I came up with a solution that doesn't require inverting the projection matrix.
A few notes for anyone who finds this by googling: this assumes an identity view matrix (camera at 0,0,0). Since I don't have a camera, I just calculate the points on the near and far planes and go straight to a ray-plane intersection test. If you do have a view matrix, you will need to multiply the points on the near and far planes by the inverse of the view matrix (see the sketch after the code).

- (void)touchToWorld:(CGPoint *)screenLocation andZCoordForPlane:(GLfloat)zValue
{
    BGAssert([[GameManager sharedInstance] renderer] != nil, @"renderer is nil");
    BGAssert(screenLocation != NULL, @"location is NULL");
    GLint screenWidth = [[[GameManager sharedInstance] renderer] backingWidth];
    BGAssert(screenWidth > 0.0f, @"screen width is <= 0");
    GLint screenHeight = [[[GameManager sharedInstance] renderer] backingHeight];
    BGAssert(screenHeight > 0.0f, @"screen height <= 0");
    GLfloat aspect = [[[GameManager sharedInstance] renderer] aspect];
    BGAssert(aspect > 0.0f, @"aspect ratio is <= 0");
    GLfloat fov = [[[GameManager sharedInstance] renderer] fov];
    BGAssert(fov > 0.0f, @"fov is <= 0");

    GLfloat near = [[[GameManager sharedInstance] renderer] nearplane];
    GLfloat far = [[[GameManager sharedInstance] renderer] farplane];

    // convert to GL coordinates
    GLfloat newX = (screenLocation->x / (screenWidth / 2.0f) - 1) * aspect;
    GLfloat newY = 1.0f - (screenLocation->y / (screenHeight / 2.0f));

    GLfloat fovInRadians = fov * (PI / 180.0f);
    GLfloat ratioX = tanf(fovInRadians / 2.0f) * newX;
    GLfloat ratioY = tanf(fovInRadians / 2.0f) * newY;

    ESVector3 pointOnNearPlane;
    ESVector3 pointOnFarPlane;

    memset(&pointOnNearPlane, 0, sizeof(ESVector3));
    memset(&pointOnFarPlane, 0, sizeof(ESVector3));

    // only x, y, z are needed for the ray-plane test below
    pointOnNearPlane.v[0] = ratioX * near;
    pointOnNearPlane.v[1] = ratioY * near;
    pointOnNearPlane.v[2] = near;

    pointOnFarPlane.v[0] = ratioX * far;
    pointOnFarPlane.v[1] = ratioY * far;
    pointOnFarPlane.v[2] = far;

    ESVector3 lineBetweenNearAndFarPlane;
    memset(&lineBetweenNearAndFarPlane, 0, sizeof(ESVector3));
    esVec3Sub(&lineBetweenNearAndFarPlane, &pointOnFarPlane, &pointOnNearPlane); 

    // we need to do ray to plane. Point on near plane is the rays origin
    // normalized direction is the rays direction
    ESVector3 normalizedDirection;
    memset(&normalizedDirection, 0, sizeof(ESVector3));
    esVec3Normalize(&normalizedDirection, &lineBetweenNearAndFarPlane);

    // plane with normal (0,0,1); with zValue as the distance term,
    // the plane equation is z + zValue = 0
    ESVector4 plane;
    memset(&plane, 0, sizeof(ESVector4));

    plane.v[0] = 0.0f;
    plane.v[1] = 0.0f;
    plane.v[2] = 1.0f;
    plane.v[3] = zValue;


    GLfloat vd = esVec3Dot((ESVector3*)&plane, &normalizedDirection);
    GLfloat v0 = -(esVec3Dot((ESVector3*)&plane, &pointOnNearPlane) + plane.v[3]);
    GLfloat t = v0 / vd;
    ESVector3 intersectPoint;
    memset(&intersectPoint, 0, sizeof(ESVector3));

    intersectPoint.v[0] = pointOnNearPlane.v[0] + normalizedDirection.v[0] * t;
    intersectPoint.v[1] = pointOnNearPlane.v[1] + normalizedDirection.v[1] * t;
    intersectPoint.v[2] = pointOnNearPlane.v[2] + normalizedDirection.v[2] * t;

    // write the world-space x/y back out through the point we were passed
    screenLocation->x = intersectPoint.v[0];
    screenLocation->y = intersectPoint.v[1];
}
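
For the view-matrix case mentioned above, the extra step would look roughly like this. This is a sketch only: transformPointByMat4 is an illustrative helper (column-major, w assumed to be 1), and inverseView is assumed to hold the inverse of your view matrix, computed however your math library does it.

    // illustrative helper: transform a point (w = 1) by a column-major 4x4
    static void transformPointByMat4(ESVector3 *out, const GLfloat m[16], const ESVector3 *p)
    {
        out->v[0] = m[0]*p->v[0] + m[4]*p->v[1] + m[8]*p->v[2]  + m[12];
        out->v[1] = m[1]*p->v[0] + m[5]*p->v[1] + m[9]*p->v[2]  + m[13];
        out->v[2] = m[2]*p->v[0] + m[6]*p->v[1] + m[10]*p->v[2] + m[14];
    }

    // with a real camera, bring the eye-space near/far points into world space
    // before building the ray and doing the ray-plane test
    ESVector3 worldNear, worldFar;
    transformPointByMat4(&worldNear, inverseView, &pointOnNearPlane);
    transformPointByMat4(&worldFar,  inverseView, &pointOnFarPlane);
    // ...then build the ray from worldNear to worldFar exactly as above
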
longshot