I am speed testing some JavaScript programs by creating a Date object and using it to get the time in milliseconds before and after the real work of the function. I made the body a simple adding for loop, and at the end I subtract the old milliseconds from the new milliseconds and print the difference. However, everything finishes in 0 milliseconds, which makes sense given that the time I see when I write it out is the same before and after the work. Am I doing it right and JavaScript (in Chrome) is just lightning fast, or is there something going on behind the scenes that is messing up my variables?
While the Date object returns times in milliseconds, that's not actually the resolution of the timer behind it. As an example, the timer might tick over only once every 10 ms. If your process takes only 3 ms, then most of the time you'll measure 0 ms (and sometimes you'll see 10 ms).
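If you want to see that granularity for yourself, here is a rough sketch that busy-waits until the reported time changes (the measureTickSize name is mine, and the value you get will vary by browser and OS):

// Rough sketch: estimate the granularity of the Date-based timer.
function measureTickSize() {
    var start = +new Date();
    var now = start;
    // Busy-wait until the reported millisecond value changes.
    while (now === start) {
        now = +new Date();
    }
    return now - start; // observed tick size, in ms
}
console.log("Timer tick is roughly " + measureTickSize() + " ms");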
The solution is to run your function many times, and time the whole thing. For example, run it a million times and divide the total time by 1000000 to get the average time of one run.
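As a minimal sketch of that idea (the timeAverage helper and the sample workload are illustrative, not from the question):

// Run fn many times and return the average time of one run, in ms.
function timeAverage(fn, iterations) {
    var start = +new Date();              // ms before the work
    for (var i = 0; i < iterations; i++) {
        fn();
    }
    var elapsed = +new Date() - start;    // total ms for all runs
    return elapsed / iterations;          // average ms per run
}

// Example: a simple adding loop, run a million times.
var avg = timeAverage(function () {
    var sum = 0;
    for (var j = 0; j < 100; j++) { sum += j; }
}, 1000000);
console.log("Average per run: " + avg + " ms");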
We'd really need to see your code. That said, a possible reason for the zero is that your loop is running asynchronously: the interpreter doesn't wait for the end of the loop before jumping to the next instruction. Or, of course, your loop may just be lightning fast.
BTW, you might be interested in using a JavaScript profiler. Firebug for Firefox has a nice one. You just need to open the console and hit Profile.
See my answer to this question for how you might implement some simple benchmark comparisons. As @Greg Hewgill pointed out, it is important to run the test multiple times to get an accurate representation of how long a specific test actually takes.
Typically, it boils down to something as simple as:
var MAX = 100000, i = 0,
    s = null, e = null;

console.info("`someMethodToTest()` over %d iterations", MAX);

s = new Date();                 // timestamp before the runs
do {
    someMethodToTest();
} while (++i < MAX);
e = new Date();                 // timestamp after the runs

// Unary + converts a Date to its millisecond value; use %f for the
// average, since it is usually a fraction of a millisecond.
console.log("Total: %dms; average: %fms", +e - +s, (+e - +s) / MAX);