What is the size, in bits, of the time variable used to represent a datetime? In time.h, is the time value stored in an int variable, or in something else?
A:
On GCC (echo '#include <time.h>' | gcc -E -):
__extension__ typedef long int __time_t;
typedef __time_t time_t;
Platform details:
Linux stanley 2.6.32-24-generic-pae #39-Ubuntu SMP Wed Jul 28 07:39:26 UTC 2010 i686 GNU/Linux
Matt Joiner
2010-08-12 07:24:52
printf("%i", sizeof(time_t));
nacho4d
2010-08-12 07:27:00
@nacho4d Don't use `%i` to print the value of `sizeof`. Use `%zu` (C99) or `%lu` and cast the value to `unsigned long` (portable).
schot
2010-08-12 09:04:20
+2
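As a minimal sketch of the portable printing suggested in the comment above (assuming a hosted C99 compiler; the cast-to-unsigned-long line also works pre-C99):
#include <stdio.h>
#include <time.h>

int main(void)
{
    /* %zu is the C99 conversion for size_t values such as the result of sizeof. */
    printf("sizeof(time_t) = %zu bytes\n", sizeof(time_t));
    /* Portable fallback: cast to unsigned long and use %lu. */
    printf("sizeof(time_t) = %lu bytes\n", (unsigned long)sizeof(time_t));
    return 0;
}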
A:
Use sizeof(time_t)
to determine the size in bytes. Then multiply that number by the number of bits per byte (usually 8, but this depends on your hardware).
PeterK
2010-08-12 07:26:10
Hint: if you are developing with Microsoft Visual C++ then you can `#define _USE_32_BIT_TIME_T` to force time_t to be 4 bytes.
Robert
2010-08-12 14:26:24
+1
A:
From the C99 standard (7.23.1):
"The range and precision of times representable in clock_t and time_t are
implementation-defined."
From the standard's perspective it might be an integer, a floating-point number, Huffman encoded, etc. In practice, on most UNIX-like systems it will be a 32- or 64-bit signed integer holding the number of seconds since the UNIX epoch (midnight, January 1, 1970, UTC).
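As an illustration (a sketch assuming a UNIX-like system where time_t is an integer type that fits in a long long and counts seconds since the epoch; neither property is guaranteed by the standard):
#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);  /* seconds since the epoch on typical UNIX-like systems */
    /* The cast is only meaningful if time_t is an integer type no wider than
       long long, which holds on common platforms but is implementation-defined. */
    printf("time(NULL) = %lld\n", (long long)now);
    printf("sizeof(time_t) = %lu bytes\n", (unsigned long)sizeof(time_t));
    return 0;
}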
schot
2010-08-12 07:28:31