I'm trying to learn a little more about vector math by writing a simple ray tracer, and I've been doing some reading on the topic, but what I haven't been able to find is how to determine the direction of the primary rays. This sounds like a simple problem, and it probably is, but with my current knowledge I haven't been able to figure it out.
I figured that I need a camera (nothing more than a location and a direction, both as vectors), and that from the camera I fire the primary rays through a screen in front of it, which represents the final image. What I can't figure out are the corner coordinates of the screen. Once I know the screen, finding the direction of each primary ray is easy.
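For what it's worth, here's roughly how I'd do that last part once I have the corners (a Python/numpy sketch; the function and corner names are just placeholders of mine):

```python
import numpy as np

def primary_ray_direction(camera_location, top_left, top_right, bottom_left, u, v):
    # Interpolate a point on the screen from three of its corners;
    # u and v run from 0 to 1 across the screen's width and height.
    point = (top_left
             + u * (top_right - top_left)
             + v * (bottom_left - top_left))
    direction = point - camera_location
    return direction / np.linalg.norm(direction)  # unit-length ray direction
```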
I'm hoping the screen can be figured out using nothing but simple math that doesn't require any rotation matrices. My best guess is this:
I have the direction of the camera as a vector, and this direction is also the normal of the projection screen's plane. So I have the screen's normal, and from there I can easily calculate the center of the screen, which is:
camera_location + (normal * distance)
where distance is the distance between the camera and the screen. However, that's where I get lost: I can't find a way to figure out the corner coordinates of the plane for an arbitrary camera direction.
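To make my setup concrete, here's a sketch of what I have so far in Python/numpy (all the names like camera_location are just my own):

```python
import numpy as np

camera_location = np.array([0.0, 0.0, 0.0])
camera_direction = np.array([0.0, 0.0, -1.0])   # assumed unit length
distance = 1.0                                   # camera-to-screen distance

# The camera direction doubles as the normal of the screen's plane,
# so the screen center follows directly from the formula above:
screen_center = camera_location + camera_direction * distance

# This is where I'm stuck: how do I go from screen_center to the four
# corner coordinates for an arbitrary camera_direction, without using
# a rotation matrix?
```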
Can any of you help me out here? And if my method can't possibly work, what does?