About raytracing

Raytracing is cool, but raytracing in the 'standard' way doesn't give you realistic lighting, since rays are cast from the camera (the position of your eyes when you sit in front of your monitor) through the viewplane (your computer screen) to see where they end up.
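To make that concrete, here's a minimal sketch (in Python; the names are made up for illustration) of how those per-pixel camera rays get generated, assuming a pinhole camera at the origin looking down -z:

    import math

    def primary_ray(x, y, width, height, fov_deg=60.0):
        # Map pixel (x, y) to a point on the viewplane in front of the
        # camera, then return the direction from the eye through it.
        aspect = width / height
        scale = math.tan(math.radians(fov_deg) / 2)
        px = (2 * (x + 0.5) / width - 1) * aspect * scale
        py = (1 - 2 * (y + 0.5) / height) * scale
        length = math.sqrt(px * px + py * py + 1)
        return (px / length, py / length, -1 / length)  # camera looks down -z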
In the real world, it doesn't work that way.
You don't emit radar/sonar-rays from your eyes and check what they hit;
instead, other objects emit energy and sometimes this energy ends up on your retina.
The proper way to calculate lighting would therefore be something like photon mapping, where every light source emits energy that gets transferred through media (air, water) and reflects/refracts across/through materials.
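This isn't full photon mapping - building and querying the photon map is the hard part - but here's a hedged sketch of just the emission step, assuming a single point light with a given total power:

    import math, random

    def emit_photon(light_pos, light_power, num_photons):
        # Each photon carries an equal share of the light's total power
        # and leaves the source in a uniformly random direction.
        while True:
            d = (random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1))
            n = math.sqrt(d[0] * d[0] + d[1] * d[1] + d[2] * d[2])
            if n > 1e-8:
                break
        direction = (d[0] / n, d[1] / n, d[2] / n)
        return light_pos, direction, light_power / num_photons

Trace each of those photons through the scene, store where they land, and you have something to estimate lighting from.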
Think about it - shooting a ray from the camera through a pixel on your screen gives you a single direction that you'll check for light intensity, while in reality light could come in at lots of different angles and end up at that same pixel.
So 'standard' raytracing doesn't give you light-scattering effects, unless you implement a special hack to take that into account.
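One way to picture what you're missing: at a surface point, light can arrive from anywhere in the hemisphere above it, so you'd have to sample lots of directions instead of just one. A sketch of picking such a direction (uniform hemisphere sampling; again just illustrative):

    import math, random

    def sample_hemisphere(normal):
        # Pick a uniformly random direction on the hemisphere around the
        # surface normal; averaging light gathered over many of these
        # approximates light arriving 'at lots of different angles'.
        while True:
            d = (random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1))
            n = math.sqrt(d[0] * d[0] + d[1] * d[1] + d[2] * d[2])
            if n > 1e-8:
                break
        d = (d[0] / n, d[1] / n, d[2] / n)
        dot = d[0] * normal[0] + d[1] * normal[1] + d[2] * normal[2]
        if dot < 0:  # flip into the hemisphere the normal points into
            d = (-d[0], -d[1], -d[2])
        return d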
And aren't hacks the exact reason why people want to use another way besides polygon rasterizing anyway?
Raytracing isn't the final solution.
The only real solution is an infinite process where lights emit energy that bounces around the scene and, if you're lucky, ends up on your camera lens.
Since an infinite process is pretty hard to simulate, we'll have to approximate it at one point or another.
Games use hacks to make stuff look good, but in the end every other rasterizer / renderer / tracer / whatever strategy has to implement a limit - a hack - at some point.
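To show where that limit typically lives, here's a hedged sketch of a recursive trace with a fixed bounce cutoff. intersect_scene() and background_color() are hypothetical helpers, and sample_hemisphere() is the sketch from above:

    MAX_DEPTH = 4  # the approximation: cut the 'infinite' process off here

    def trace(origin, direction, depth=0):
        if depth >= MAX_DEPTH:
            return (0.0, 0.0, 0.0)  # pretend no more light arrives - the hack
        hit = intersect_scene(origin, direction)  # hypothetical helper
        if hit is None:
            return background_color(direction)    # hypothetical helper
        bounce = sample_hemisphere(hit["normal"])
        gathered = trace(hit["point"], bounce, depth + 1)
        # One bounce sample per hit; a real renderer averages many and
        # weights them by the material's reflectance.
        return tuple(e + a * g for e, a, g in
                     zip(hit["emitted"], hit["albedo"], gathered))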
The important thing is - does it really matter?
Are we going for a 100% simulation of real life, or is it good enough to calculate a picture that looks 100% real, whatever technique is used?
If you can't tell whether a picture is real or CGI, does it matter what methods or hacks were used?