Hello,
As a homework assignment, we're writing a software rasterizer. I've noticed my z-buffering is not working as well as it should, so I'm trying to debug it by outputting the depth buffer to the screen (black is near, white is far away).
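For reference, I shade each pixel roughly like this (just a sketch of my visualization, assuming the depth value is already normalized to [0, 1]; PlotPixel stands in for my actual framebuffer write):

// depth is assumed to be in [0, 1]; PlotPixel is a placeholder for my framebuffer write
unsigned char shade = (unsigned char)(depth * 255.0f); // 0 = black (near), 255 = white (far)
PlotPixel(x, y, shade, shade, shade);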
However, I'm getting peculiar per-vertex z values. This is what I use to transform the points:
float Camera::GetZToPoint(Vec3 a_Point)
{
    // Rotate the point, then offset it by the camera position.
    Vec3 camera_new = (m_MatRotation * a_Point) - m_Position;
    // Divide the projection scale (half the screen size over tan(half the FOV)) by camera-space z.
    return (HALFSCREEN / tanf(_RadToDeg(60.f * 0.5f)) / camera_new.z);
}
m_MatRotation is a 3x3 matrix; multiplying it by a vector returns the transformed vector.
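(To be clear about the convention, the multiplication is the usual row-times-column product; the types below are only a sketch of that, not my exact classes:)

struct Vec3 { float x, y, z; };
struct Mat3 { float m[3][3]; };

// Standard 3x3 matrix times column vector: each result component
// is the dot product of one matrix row with the vector.
Vec3 operator*(const Mat3& a, const Vec3& v)
{
    return Vec3{
        a.m[0][0] * v.x + a.m[0][1] * v.y + a.m[0][2] * v.z,
        a.m[1][0] * v.x + a.m[1][1] * v.y + a.m[1][2] * v.z,
        a.m[2][0] * v.x + a.m[2][1] * v.y + a.m[2][2] * v.z
    };
}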
The minimum and maximum z values I get lie between 0 and x, where x appears to be an arbitrary number.
Am I doing this transformation right? If so, how can I normalize my z values so they lie between two fixed bounds?
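I imagine it would be something like the linear remap below, where zNear and zFar are bounds I'd pick myself (they're not existing members of my camera class), but I'm not sure that's the right approach:

#include <math.h>

// Sketch: linearly remap camera-space z from [zNear, zFar] to [0, 1].
float NormalizeZ(float z, float zNear, float zFar)
{
    float t = (z - zNear) / (zFar - zNear); // 0 at the near bound, 1 at the far bound
    return fminf(fmaxf(t, 0.0f), 1.0f);    // clamp so outliers stay in range
}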
Thanks in advance.