The number of chars actually used obviously depends on the value: if time_stamp_for_file_name
is 0 you only actually need 2 bytes. If there's any doubt, you can use snprintf, which tells you how much space you need:
/* needs <stdio.h> for snprintf and <stdlib.h> for malloc */
int len = snprintf(0, 0, "%ld", (long)time_stamp_for_file_name) + 1; /* +1 for the nul terminator */
char *tmp = malloc(len);
if (tmp == 0) { /* handle error */ }
snprintf(tmp, len, "%ld", (long)time_stamp_for_file_name);
Beware implementations where snprintf returns -1 for insufficient space, rather than the space required.
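If you might run on such an implementation, one hedged workaround is to retry with a larger buffer until the call reports success. format_long below is a hypothetical helper sketching the idea; the 16-byte starting size is arbitrary:
#include <stdio.h>
#include <stdlib.h>

/* Format a long, coping with snprintf variants that return -1 on overflow.
   Returns a malloc'd string the caller frees, or 0 on allocation failure. */
char *format_long(long value) {
    size_t cap = 16;  /* arbitrary starting size */
    char *tmp = malloc(cap);
    while (tmp != 0) {
        int n = snprintf(tmp, cap, "%ld", value);
        if (n >= 0 && (size_t)n < cap)
            return tmp;  /* the digits and the nul all fit */
        cap = (n >= 0) ? (size_t)n + 1 : cap * 2;  /* exact need, or just double */
        char *bigger = realloc(tmp, cap);
        if (bigger == 0)
            free(tmp);
        tmp = bigger;
    }
    return 0;  /* allocation failed; caller handles the error */
}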
As Paul R says, though, you can figure out a fixed upper bound based on the size of long
on your implementation. That way you avoid dynamic allocation entirely. For example:
#define LONG_LEN (((sizeof(long)*CHAR_BIT)/3)+2)
(based on the fact that the base-2 log of 10 is greater than 3). That +2 gives you 1 for the minus sign and 1 for the fact that integer division rounds down. You'd need another 1 for the nul terminator.
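To make the arithmetic concrete: with a 32-bit long the formula gives (32/3)+2 = 10+2 = 12, which covers the 11 characters of "-2147483648". A minimal usage sketch:
/* needs <limits.h> for CHAR_BIT and <stdio.h> for sprintf */
#define LONG_LEN (((sizeof(long)*CHAR_BIT)/3)+2)
char tmp[LONG_LEN + 1];  /* +1 for the nul terminator */
sprintf(tmp, "%ld", (long)time_stamp_for_file_name);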
Or:
/* needs <limits.h> for LONG_MIN and <stdio.h> for sprintf */
#define STRINGIFY(ARG) #ARG
#define EXPAND_AND_STRINGIFY(ARG) STRINGIFY(ARG) /* expand LONG_MIN before stringifying */
#define VERBOSE_LONG EXPAND_AND_STRINGIFY(LONG_MIN)
#define LONG_LEN sizeof(VERBOSE_LONG) /* sizeof a string literal counts the nul */
char tmp[LONG_LEN];
sprintf(tmp, "%ld", (long)time_stamp_for_file_name);
VERBOSE_LONG might be a slightly bigger string than you actually need. On my compiler it's (-2147483647L-1). I'm not sure whether LONG_MIN can expand to something like a hex literal or a compiler intrinsic, but if so then it could be too short, and this trick won't work. It's easy enough to unit-test, though.
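A minimal sketch of such a test, assuming the LONG_LEN definition above (the 64-byte scratch buffer is just a generous guess):
#include <assert.h>
#include <limits.h>
#include <stdio.h>

#define STRINGIFY(ARG) #ARG
#define EXPAND_AND_STRINGIFY(ARG) STRINGIFY(ARG)
#define VERBOSE_LONG EXPAND_AND_STRINGIFY(LONG_MIN)
#define LONG_LEN sizeof(VERBOSE_LONG)

int main(void) {
    char scratch[64];  /* generously oversized: no real long needs this much */
    /* sprintf returns the character count excluding the nul, hence the +1 */
    assert((size_t)sprintf(scratch, "%ld", LONG_MIN) + 1 <= LONG_LEN);
    return 0;
}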
If you want a tight upper bound to cover all possibilities within the standard, up to a certain limit, you could try something like this:
#if LONG_MAX <= 2147483647L
#define LONG_LEN 11
#elif LONG_MAX <= 4294967295L
#define LONG_LEN 11
#elif LONG_MAX <= 8589934591L
/* ... etc.: add more clauses as new architectures
   with bigger longs are invented */
#endif
But I doubt it's worth it: better just to define it in some kind of portability header and configure it manually for new platforms.
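Such a header might look like this sketch (hypothetical; the __LP64__ test is only illustrative, and note that long stays 32 bits on LLP64 platforms such as 64-bit Windows):
/* portability.h -- hypothetical sketch; extend by hand for each new platform */
#if defined(__LP64__)
#define LONG_LEN 20  /* "-9223372036854775808", a 64-bit long */
#else
#define LONG_LEN 11  /* "-2147483648", a 32-bit long */
#endif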