time.clock() shows 13 decimal places on Windows but only two on Linux.
time.time() shows 17 decimal places on Linux and 16 on Windows, but the actual precision is lower than the number of digits suggests.
This is described here: http://docs.python.org/library/time.html
I don't agree with the documentation that time.clock() should be used for benchmarking on Unix/Linux. It is not precise enough.
So which timer to use depends on the operating system.
On Linux, time.time() has high resolution:
>>> time.time(), time.time()
(1281384913.4374139, 1281384913.4374161)
On Windows, however, successive calls to the time functions seem to return the same value:
>>> time.time() - int(time.time()), time.time() - int(time.time()), time.time() - time.time()
(0.9570000171661377, 0.9570000171661377, 0.0)
Even if I write the calls on separate lines on Windows, they still return the same value, so the real precision is much lower than the number of decimals suggests.
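To put a number on that, here is a minimal sketch (measure_resolution is a hypothetical helper, not part of the standard library) that estimates a clock's effective resolution by spinning until its value changes and keeping the smallest step seen:

import time

def measure_resolution(clock, samples=100):
    # Spin until the clock value changes and record the smallest
    # step observed; this approximates the timer's real granularity.
    smallest = None
    for _ in range(samples):
        t1 = clock()
        t2 = clock()
        while t2 == t1:
            t2 = clock()
        step = t2 - t1
        if smallest is None or step < smallest:
            smallest = step
    return smallest

print(measure_resolution(time.time))   # smallest observed step of time.time()

Running this for time.time() on each platform should reproduce the difference shown above: tiny steps on Linux, much coarser steps on Windows.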
So for serious measurements a platform check (import platform; platform.system()) has to be done in order to decide whether to use time.clock() or time.time(), as sketched below.
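A minimal sketch of such a check, assuming the Python 2.6/3.1-era APIs used in this answer (time.clock() was removed in Python 3.8):

import platform
import time

# Pick the higher-resolution timer for the current platform:
# time.clock() on Windows, time.time() elsewhere.
if platform.system() == 'Windows':
    default_timer = time.clock
else:
    default_timer = time.time

start = default_timer()
# ... code being measured ...
elapsed = default_timer() - start

Note that timeit.default_timer already makes exactly this choice on Python 2.x, and Python 3.3+ provides time.perf_counter() as a portable high-resolution timer.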
(Tested on Windows 7 and Ubuntu 9.10 with Python 2.6 and 3.1.)