Hi,
I'm building a tool in flash to tell me how long it takes to load a bit of content off of a server. I'm doing
var foo:Number = new Date().getTime();
// get the thing
var bar:Number = new Date().getTime();
trace(bar - foo); // elapsed milliseconds
Yet the times fluctuate hugely, anywhere from 3 ms to 150 ms for the same request. When I time the same fetch from Python, the results are consistently around 5 ms. Does anyone know what's going on, and how I can fix it?
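For reference, my Python-side measurement is essentially this (the `fetch` callable here is a stand-in for the actual request; I'm just wrapping the call with two clock reads, same as the Flash version):

```python
import time

def time_fetch(fetch):
    """Return the elapsed wall-clock time, in milliseconds, for one call to fetch()."""
    start = time.perf_counter()
    fetch()                       # the actual request goes here
    return (time.perf_counter() - start) * 1000.0

# Stand-in for a real request: sleep ~5 ms so the timing is visible.
elapsed_ms = time_fetch(lambda: time.sleep(0.005))
print(f"{elapsed_ms:.1f} ms")
```

So the measurement logic itself seems sound; it's only the Flash numbers that bounce around.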
Thanks very much!