I am currently trying to convert a touch from screen space to where it is in world space for a 2D game I am working on.
The view I am working on has nearly the same coordinate system as the screen: if someone touches the pixel at x = 345, y = 500 on the screen, the coordinates will be roughly the same in the view, except that the y axis is flipped, because OpenGL puts the origin at the lower-left corner instead of the upper-left.
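To make the y-flip concrete, it is just a subtraction from the screen height. A tiny hypothetical helper (the 1024-point screen height here is only an example value, not from my actual code):

```python
def touch_to_gl(x, y, screen_height):
    """Convert a touch with a top-left origin to OpenGL's
    bottom-left-origin window coordinates."""
    return x, screen_height - y

print(touch_to_gl(345, 500, 1024))  # -> (345, 524)
```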
The "camera" is at (0, 0, 0) looking down negative Z. I say "camera" because I haven't coded one yet; right now I am just translating every sprite's Z to -100.
The pseudocode I have tried thus far (and have been double-checking in Mathematica) is this:
// scale the screen point into a value from 0 to 1
point = { screenPoint.x / screenWidth, screenPoint.y / screenHeight, -100, 1 }
// doing a full inverse here, not just swapping rows and cols
out = Inverse(viewProjection) * point
inverseW = 1 / out.w
finalWorldCoord = out * inverseW
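To show concretely what these steps produce, here is a runnable sketch of the pseudocode above in Python. The view matrix is taken as identity (camera at the origin looking down negative Z, as described), and the projection parameters (near = 1, far = 1000, 90-degree vertical field of view, square aspect) and the 768 x 1024 screen size are assumptions for illustration, not values from my actual code:

```python
# Runnable sketch of the unproject attempt described above. The
# projection parameters and screen size are assumptions for illustration.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (list of rows) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def invert4(m):
    """Full 4x4 matrix inverse via Gauss-Jordan elimination
    (a real inverse, not just swapping rows and cols)."""
    a = [row[:] + [1.0 if i == j else 0.0 for j in range(4)]
         for i, row in enumerate(m)]
    for col in range(4):
        pivot = max(range(col, 4), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        p = a[col][col]
        a[col] = [x / p for x in a[col]]
        for r in range(4):
            if r != col:
                f = a[r][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [row[4:] for row in a]

# Assumed symmetric perspective projection: near = 1, far = 1000,
# 90-degree vertical FOV (so cot(fovy/2) = 1), aspect ratio 1.
near, far, f = 1.0, 1000.0, 1.0
projection = [
    [f,   0.0, 0.0,                         0.0],
    [0.0, f,   0.0,                         0.0],
    [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
    [0.0, 0.0, -1.0,                        0.0],
]
inv_vp = invert4(projection)  # view is identity, so viewProjection == projection

# Assumed 768 x 1024 screen; touch point from the example above.
screen_w, screen_h = 768.0, 1024.0
touch_x, touch_y = 345.0, 500.0

# scale the screen point into a value from 0 to 1 (as in the pseudocode)
point = [touch_x / screen_w, touch_y / screen_h, -100.0, 1.0]
out = mat_vec(inv_vp, point)
world = [c / out[3] for c in out]  # multiply by 1/w

print(world)
```

Running this reproduces the symptom: the resulting world x and y come out on the order of 0.01, far smaller than the sprite positions they should map to.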
The issue is that this is giving me values that are way less than what they should be, and I am not sure why.
This is with OpenGL ES 2.0, on iPhone OS 3.2.
Does anyone know the correct way to do this?