This may be waaay too specific for SO, but there seems to be a dearth of info on the Sun raster standard. (Even JWZ is frustrated by this!)
Intro: The Sun raster standard says that rows of pixels have padding at the end such that the number of bits in a row is a multiple of 16 (i.e. an even number of bytes). For example, if you had a 7-pixel-wide 24-bit image, a row would normally take 7 * 3 = 21 bytes, but Sun raster would pad it to 22 bytes so the number of bits is divisible by 16. The code below achieves this for 24-bit images of arbitrary width:
row_byte_length = num_cols * 3;
row_byte_length += row_byte_length % 2;  /* pad an odd byte count up to even */
Here's my question: both ImageMagick and GIMP follow this rule for 24-bit images, but for 32-bit images they do something weird that I don't understand. Since the bit depth gives 4-byte pixels, any image width takes an even number of bytes per row, which always complies with the "16-bit alignment" rule. But when they compute the row length, they add an extra byte for images with odd widths, making the row length odd (i.e. the number of bits in the row is not divisible by 16). The code below describes what they're doing for 32-bit images:
row_byte_length = num_cols * 4 + num_cols % 2;  /* extra byte for odd widths */
Adding one appears to go against the "16-bit alignment" rule specified by the Sun format, and serves no apparent purpose. However, if GIMP and ImageMagick both do it this way, I must be misreading the Sun raster spec.
Are there any Sun raster experts out there who know why this is done?
Edit: My mistake, GIMP only outputs Sun raster at up to 24 bits. It looks like this is only an ImageMagick issue, so it's probably a bug. I'm flagging this for closure; it's better discussed on the ImageMagick forums.