It is not recommended to use java.util.Date or System.currentTimeMillis() to measure elapsed time. These values are not guaranteed to be monotonic and will change whenever the system clock is adjusted (e.g. when it is corrected from a time server). In all probability this will happen rarely, but why not code a better solution rather than worry about occasional negative or very large differences? Instead, I would recommend using System.nanoTime().
long t1 = System.nanoTime();
// ... the code being timed ...
long t2 = System.nanoTime();
// Note: integer division truncates to whole seconds; divide by 1e9 as a
// double, or keep the raw nanoseconds, if sub-second precision is needed.
long elapsedTimeInSeconds = (t2 - t1) / 1000000000;
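As a fuller, self-contained sketch (the TimeUnit conversion and the Thread.sleep stand-in workload are my own additions, not part of the snippet above), the same measurement can be written so it keeps sub-second precision:

import java.util.concurrent.TimeUnit;

public class ElapsedTimeDemo {
    public static void main(String[] args) throws InterruptedException {
        long start = System.nanoTime();

        // Stand-in for the work being measured (hypothetical workload).
        Thread.sleep(1234);

        long elapsedNanos = System.nanoTime() - start;
        // TimeUnit avoids hand-rolled division and keeps the intent clear.
        System.out.println("Elapsed: " + TimeUnit.NANOSECONDS.toMillis(elapsedNanos) + " ms");
    }
}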
EDIT
For more information about monotonicity, see the answer to a related question I asked: where possible, nanoTime uses a monotonic clock. I have only tested this on Windows XP with Java 1.6, modifying the system clock by hand; there, nanoTime was monotonic and currentTimeMillis wasn't.
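A minimal sketch of that kind of check, assuming you adjust the system clock by hand while it runs (the 100 ms polling interval is arbitrary), prints a line whenever either clock appears to go backwards:

public class MonotonicityCheck {
    public static void main(String[] args) throws InterruptedException {
        long lastNano = System.nanoTime();
        long lastMillis = System.currentTimeMillis();
        while (true) {
            Thread.sleep(100);
            long nano = System.nanoTime();
            long millis = System.currentTimeMillis();
            // A monotonic clock should never produce a smaller value on a later read.
            if (nano < lastNano) {
                System.out.println("nanoTime went backwards by " + (lastNano - nano) + " ns");
            }
            if (millis < lastMillis) {
                System.out.println("currentTimeMillis went backwards by " + (lastMillis - millis) + " ms");
            }
            lastNano = nano;
            lastMillis = millis;
        }
    }
}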
Also, from the Java Real-Time System (Java RTS) docs:
Q: 50. Is the time returned via the real-time clock of better resolution than that returned by System.nanoTime()?

The real-time clock and System.nanoTime() are both based on the same system call and thus the same clock.

With Java RTS, all time-based APIs (for example, Timers, Periodic Threads, Deadline Monitoring, and so forth) are based on the high-resolution timer. And, together with real-time priorities, they can ensure that the appropriate code will be executed at the right time for real-time constraints. In contrast, ordinary Java SE APIs offer just a few methods capable of handling high-resolution times, with no guarantee of execution at a given time. Using System.nanoTime() between various points in the code to perform elapsed time measurements should always be accurate.
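As a rough illustration of measuring elapsed time between various points (the phases and the workload below are purely hypothetical), the same nanoTime samples can be reused for several intervals:

import java.util.concurrent.TimeUnit;

public class PhaseTiming {
    public static void main(String[] args) {
        long t0 = System.nanoTime();

        // Hypothetical phase 1: build some data.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 100000; i++) {
            sb.append(i);
        }
        long t1 = System.nanoTime();

        // Hypothetical phase 2: inspect it.
        int length = sb.length();
        long t2 = System.nanoTime();

        System.out.println("Phase 1: " + TimeUnit.NANOSECONDS.toMicros(t1 - t0) + " us");
        System.out.println("Phase 2: " + TimeUnit.NANOSECONDS.toMicros(t2 - t1) + " us, length=" + length);
    }
}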