In OpenGL, I have a 3D model that I'm performing ray-triangle intersection tests against, using the code explained in the paper "Fast, Minimum Storage Ray/Triangle Intersection" ( http://jgt.akpeters.com/papers/MollerTrumbore97/ ).
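For reference, my intersection routine follows the paper's algorithm. A rough sketch of what that test looks like with my own types (assuming my Vector3 has dot() and cross() helpers; the actual code I use is the sample from the paper):

bool intersectTriangle(const Vector3 & orig, const Vector3 & dir,
                       const Vector3 & v0, const Vector3 & v1, const Vector3 & v2,
                       float & t, float & u, float & v) {
    const float EPSILON = 0.000001f;
    Vector3 edge1 = v1 - v0;
    Vector3 edge2 = v2 - v0;
    Vector3 pvec = dir.cross(edge2);        // begin computing the determinant
    float det = edge1.dot(pvec);
    if (det > -EPSILON && det < EPSILON)    // ray is parallel to the triangle plane
        return false;
    float invDet = 1.0f / det;
    Vector3 tvec = orig - v0;               // vector from v0 to the ray origin
    u = tvec.dot(pvec) * invDet;            // first barycentric coordinate
    if (u < 0.0f || u > 1.0f)
        return false;
    Vector3 qvec = tvec.cross(edge1);
    v = dir.dot(qvec) * invDet;             // second barycentric coordinate
    if (v < 0.0f || u + v > 1.0f)
        return false;
    t = edge2.dot(qvec) * invDet;           // distance along the ray
    return true;
}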
My cursor position is unprojected into world space using the following code:
bool SCamera::unproject(Vector3 input, Vector3 & output) {
    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);   // Grab the current viewport

    float x = input.mX;
    float y = input.mY;
    float z = input.mZ;

    Matrix4 P, Mv, res;
    P  = getProjection();
    Mv = getCameraTransform();

    Vector3 N;                              // Cursor position in normalized device coordinates (-1..1)
    N.mX = ((x - viewport[0]) / viewport[2]) * 2 - 1;
    N.mY = ((y - viewport[1]) / viewport[3]) * 2 - 1;
    N.mZ = z * 2 - 1;

    res = P * Mv;                           // Combined projection * view matrix
    Vector3 w = res.inverse() * N;          // Apply the inverse transform to N

    output.mX = w[0];
    output.mY = w[1];
    output.mZ = w[2];
    return true;
}
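For comparison, this function is intended to do roughly what gluUnProject does (modulo the fact that gluUnProject takes the full modelview matrix, not just the camera transform). The equivalent call would look something like this; it's just a reference sketch, not something I've dropped into my code:

// Reference sketch: unprojection via GLU (requires GL/glu.h).
GLdouble modelview[16], projection[16];
GLint viewport[4];
glGetDoublev(GL_MODELVIEW_MATRIX, modelview);
glGetDoublev(GL_PROJECTION_MATRIX, projection);
glGetIntegerv(GL_VIEWPORT, viewport);

GLdouble wx, wy, wz;
gluUnProject(xClick, viewport[3] - yClick, 0.0,     // window y flipped to GL's bottom-left origin
             modelview, projection, viewport,
             &wx, &wy, &wz);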
After that, I form a ray by doing the following:
unproject(Vector3(xClick, yClick, 0), resultUnproject);  // cursor unprojected at the near plane (z = 0)
ray.origin = cameraPosition;
ray.direction = resultUnproject - ray.origin;
ray.direction.normalize();
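For reference, another way to form the same ray would be to unproject the cursor at two depths and take the difference as the direction; a rough sketch of that variant (it would replace the block above):

Vector3 nearPoint, farPoint;
unproject(Vector3(xClick, yClick, 0), nearPoint);   // cursor on the near plane
unproject(Vector3(xClick, yClick, 1), farPoint);    // cursor on the far plane

ray.origin = nearPoint;
ray.direction = farPoint - nearPoint;
ray.direction.normalize();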
Now, finally, I'm trying to run this ray through the triangle intersection code (linked above), but I can't seem to get the transform right. My current attempt is as follows:
Matrix4 mview, T;
mview = getModelview();
T = mview.inverse();                // inverse modelview, to move the ray into the model's space
ray.origin = T * ray.origin;
ray.direction = T * ray.direction;
ray.direction.normalize();
For some reason, this doesn't work. Am I forming my ray wrong? Or transforming it wrong?
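One thing I'm wondering about is whether the direction should be transformed as a vector rather than as a point, so that it doesn't pick up the translation part of T. A rough sketch of what I mean, replacing the two T * ... lines above (and assuming my Vector3 has an operator+ to match its operator-):

// Transform two points on the ray and difference them, so the translation
// part of T cancels out of the direction.
Vector3 p0 = T * ray.origin;
Vector3 p1 = T * (ray.origin + ray.direction);
ray.origin = p0;
ray.direction = p1 - p0;
ray.direction.normalize();

Is that closer to the right approach, or is the problem elsewhere?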