
I am trying to identify file types for directory entries (Windows, Unix, etc.).

In sys/stat.h, the high-order nybble of the st_mode word has these coded values:

#define S_IFDIR  0x4000  /* directory */
#define S_IFIFO  0x1000  /* FIFO special */
#define S_IFCHR  0x2000  /* character special */
#define S_IFBLK  0x3000  /* block special */
#define S_IFREG  0x8000  /* or just 0x0000, regular */

From the comment it seems the nybble could be either 0 or 8 to represent a 'regular file'.

So this raises the question: in what circumstances is it 0 and not 8? If I had defined these codes, I would have reserved 0 to indicate unknown/undefined/invalid/not-a-file or something like that.

Indeed the S_ISREG macro is:

#define S_ISREG(m)  ((m) & S_IFREG)

This would seem to me to indicate that a regular file should always be expected to have the code 8 (and 0 would be an aberration?).

Would it be a valid assumption to interpret 0 as an unknown or invalid file, ignore the 'or just 0x0000' comment, and always expect 8 to be used for all regular files?

+1  A: 

Most sources indicate that checking S_ISREG is enough; I'm not sure when you'd see 0x0000 as a "regular" file.

I believe some old implementations used 0x0000 (a really old DJGPP header search turns this up) but it's the only real reference I can find. Everything else points to 0x8000.

Basically, use the S_ISREG macro and hope that the header on whatever you're compiling against does the right thing.

Joe
A: 

I would trust the definitions of S_IFREG and S_ISREG. I've never worked with a file system that broke those macros.

My guess is that the 0x0000 definition for a regular file is to handle legacy file systems that may have used a different encoding of file type information. What OS and file system are you using?

Keith Smith