Hi guys,

Does anyone know how to calculate time differences in C++ in milliseconds? I used difftime (time.h), but it doesn't have enough precision for what I'm trying to measure.

Thanks in advance. Alejo

+1  A: 

You can use gettimeofday to get the time since the epoch with microsecond resolution. The seconds field of the value returned by gettimeofday() is the same as that returned by time() and can be cast to a time_t and used in difftime. A millisecond is 1000 microseconds.

After you use difftime, calculate the difference in the microseconds field yourself.
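
For example (a minimal sketch assuming a POSIX system; the helper name is illustrative):

#include <sys/time.h>

// Milliseconds elapsed from start to end, computed from two gettimeofday() readings.
long elapsed_ms(const timeval& start, const timeval& end)
{
    return (end.tv_sec - start.tv_sec) * 1000L +
           (end.tv_usec - start.tv_usec) / 1000L;
}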

SoapBox
+2  A: 

You have to use one of the more specific time structures, either timeval (microsecond-resolution) or timespec (nanosecond-resolution), but you can do it manually fairly easily:

#include <sys/time.h>

// Returns t1 - t2 in milliseconds. The difference is computed in
// microseconds first, then converted to milliseconds at the end.
int diff_ms(timeval t1, timeval t2)
{
    return (((t1.tv_sec - t2.tv_sec) * 1000000) + 
            (t1.tv_usec - t2.tv_usec))/1000;
}

This obviously has some problems with integer overflow if the difference in times is really large (or if you have 16-bit ints), but that's probably not a common case.

Tyler McHenry
I think you meant *1000, not *1000000.
SoapBox
You might want to add +500 usec before dividing by 1000 there, so that 999 usec rounds up to 1 msec instead of down to 0 msec.
Mr.Ree
No, I did mean *1000000. It's doing the calculation in us and then converting to ms at the end. The +500 suggestion is a good one, though.
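
For example, the function body would become (untested sketch; assumes a non-negative difference):

    return (((t1.tv_sec - t2.tv_sec) * 1000000) +
            (t1.tv_usec - t2.tv_usec) + 500) / 1000;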
Tyler McHenry
+1  A: 

The clock function gives you a millisecond timer, but it's not the greatest. Its real resolution is going to depend on your system. You can try

#include <time.h>
#include <iostream>

clock_t clo = clock();
// do stuff
std::cout << (clock() - clo) << std::endl;  // difference in clock ticks

and see what kind of results you get.

Bill the Lizard
In my experience this usually ticks on the order of 10ms. Sometimes 5...
SoapBox
That's pretty typical on Unix and Linux systems. I think it can be as bad as about 50 ms, though.
Bill the Lizard
The CLOCKS_PER_SEC macro in <time.h> tells you how many ticks there are per second. It was classically 50 or 60, giving 20 or 16.7 ms.
Jonathan Leffler
@Jonathan: Thanks, that's good to know.
Bill the Lizard
Actually, CLOCKS_PER_SEC gives you the number of clock_t units per second. For example, you might have 1000 CLOCKS_PER_SEC (clock() returns milliseconds) yet have clock() return multiples of 16 ms. Call clock() in a tight loop and it will return: x, ..., x, x+16, ..., x+16, x+32... on my system
aib
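
A quick way to observe that granularity yourself (minimal sketch):

#include <stdio.h>
#include <time.h>

int main()
{
    // Spin until clock() changes value to see its real tick size.
    clock_t start = clock();
    clock_t next = start;
    while (next == start)
        next = clock();
    printf("CLOCKS_PER_SEC = %ld, observed tick = %ld clock_t units\n",
           (long)CLOCKS_PER_SEC, (long)(next - start));
    return 0;
}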
+1  A: 

If you're looking to do benchmarking, you might want to see some of the other threads here on SO which discuss the topic.

Also, be sure you understand the difference between accuracy and precision.

Alastair
A: 

I think you will have to use something platform-specific. Hopefully that won't matter? E.g., on Windows, look at QueryPerformanceCounter(), which will give you something much better than milliseconds.
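
A minimal sketch of what that might look like (Windows-only; error checking omitted):

#include <windows.h>
#include <stdio.h>

int main()
{
    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq);   // counter ticks per second
    QueryPerformanceCounter(&start);

    // ...code to measure...

    QueryPerformanceCounter(&end);
    double ms = (double)(end.QuadPart - start.QuadPart) * 1000.0 / (double)freq.QuadPart;
    printf("Elapsed: %.3f ms\n", ms);
    return 0;
}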

Peter
+1  A: 

You can get microsecond and nanosecond precision out of Boost.Date_Time.
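
For instance (a sketch using the posix_time part of the library):

#include <boost/date_time/posix_time/posix_time.hpp>
#include <iostream>

int main()
{
    using namespace boost::posix_time;

    ptime start = microsec_clock::local_time();

    // ...code to measure...

    ptime stop = microsec_clock::local_time();
    time_duration elapsed = stop - start;
    std::cout << elapsed.total_milliseconds() << " ms\n";
    return 0;
}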

Ferruccio
+1  A: 

Hi, if you are using Win32, FILETIME is the most accurate time you can get: it contains a 64-bit value representing the number of 100-nanosecond intervals since January 1, 1601 (UTC).

So if you want to calculate the difference between two times in milliseconds, you can do the following:

#include <windows.h>
#include <tchar.h>
#include <stdio.h>

// Returns the current system time as the number of 100-nanosecond
// intervals since January 1, 1601 (UTC).
UINT64 getTime()
{
    SYSTEMTIME st;
    GetSystemTime(&st);

    FILETIME ft;
    SystemTimeToFileTime(&st, &ft);  // converts to file time format
    ULARGE_INTEGER ui;
    ui.LowPart = ft.dwLowDateTime;
    ui.HighPart = ft.dwHighDateTime;

    return ui.QuadPart;
}

int _tmain(int argc, TCHAR* argv[], TCHAR* envp[])
{
    //! Start counting time
    UINT64 start, finish;

    start = getTime();

    //do something...

    //! Stop counting elapsed time
    finish = getTime();

    //now you can calculate the difference any way that you want
    //in seconds (10,000,000 100-ns intervals per second):
    _tprintf(_T("Time elapsed executing this code: %.03f seconds.\n"),
             ((double)(finish - start)) / 10000000.0);
    //or in milliseconds (10,000 100-ns intervals per millisecond):
    _tprintf(_T("Time elapsed executing this code: %I64d milliseconds.\n"),
             (finish - start) / 10000);

    return 0;
}
Nuno