When compiling a 64-bit application, why does strlen() return a 64-bit integer? Am I missing something?
I understand that strlen() returns size_t, and by definition that return type can't change, but since size_t grows to 64 bits on a 64-bit platform... why would strlen() ever need a 64-bit return value?
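For what it's worth, a quick check confirms the width (a minimal sketch, assuming a typical LP64 platform such as 64-bit Linux, where size_t is 8 bytes):

```c
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* On an LP64 platform, size_t is 8 bytes wide, so strlen()
       effectively returns a 64-bit unsigned integer. */
    printf("sizeof(size_t) = %zu\n", sizeof(size_t));
    printf("strlen(\"hello\") = %zu\n", strlen("hello"));
    return 0;
}
```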
The function is designed to be used with strings. With that in mind:
Do programmers commonly create multi-gigabyte or multi-terabyte strings? If they did, wouldn't they need a better way to determine the length than scanning for the terminating null character ('\0')?
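To be clear about what I mean by "scanning": a naive strlen() is just a linear walk to the terminating byte. Here's a sketch (my_strlen is just an illustrative name; real libc implementations are heavily optimized, e.g. reading a word at a time, but the semantics are the same):

```c
#include <stddef.h>

/* Naive strlen(): walk the buffer until the terminating '\0'
   and count the bytes along the way. O(n) in the string length. */
size_t my_strlen(const char *s)
{
    const char *p = s;
    while (*p != '\0')
        p++;
    return (size_t)(p - s);
}
```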
I think this is ridiculous. In fact, maybe we need a StrLenAsync() function with a callback, just to handle the ultra-long process of searching for the '\0' in a 40 TB string. Sound stupid? Yeah, well, strlen() returns a 64-bit integer!
Of course, the proposed StrLenAsync() function is a joke.