Apart from using byte[] in streaming, I don't really see byte and short used much. On the other hand, I have seen long used where the actual value never exceeds |100| and byte would be more appropriate. Is this a consequence of the relatively inexpensive nature of memory now, or is this just minutiae that developers needn't worry about?
I think in most applications short has no domain meaning, so it makes more sense to use Integer.
They are used when programming for embedded devices that are short on memory or disk space. Such as appliances and other electronic devices.
Byte is also used in low level web programming, where you send requests to web servers using headers, etc.
I would most often use the short and byte types when working with binary formats and DataInput/DataOutput instances. If the spec says the next value is an 8-bit or 16-bit value and there's no value in promoting them to int (perhaps they're bit flags), they are an obvious choice.
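A minimal sketch of that pattern using DataInputStream over an in-memory buffer; the record layout here (an 8-bit flags field followed by a 16-bit length) is made up for illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class BinaryHeaderDemo {
    public static void main(String[] args) throws IOException {
        // Hypothetical layout: one 8-bit flags field, then one 16-bit length field.
        byte[] raw = {0x05, 0x01, 0x00};
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(raw));

        byte flags = in.readByte();    // exactly 8 bits, as the spec dictates
        short length = in.readShort(); // exactly 16 bits, big-endian

        System.out.println(flags);  // 5
        System.out.println(length); // 256
    }
}
```

Using byte and short here documents the wire format directly in the variable types.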
short and others are often used for storing image data. Note that it is the number of bits which is really important, not the arithmetic properties (which just cause promotion to int or better). short is also used as array indexes in JavaCard (1.0 and 2.0, IIRC, but not 3.0, which also has an HTTP stack and web services).
The byte datatype is frequently used when dealing with raw data from a file or network connection, though it is mostly used as byte[]. The short and short[] types are often used in connection with GUIs and image processing.
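As one concrete case, the 16-bit grayscale image type in java.awt.image is backed by a short[]; a small sketch of reading a pixel back as an unsigned sample:

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferUShort;

public class UShortImageDemo {
    public static void main(String[] args) {
        // TYPE_USHORT_GRAY stores each pixel as one unsigned 16-bit sample,
        // held in a short[] inside a DataBufferUShort.
        BufferedImage img = new BufferedImage(4, 4, BufferedImage.TYPE_USHORT_GRAY);
        short[] pixels = ((DataBufferUShort) img.getRaster().getDataBuffer()).getData();

        pixels[0] = (short) 40000;       // above Short.MAX_VALUE, but fits in 16 bits
        int sample = pixels[0] & 0xFFFF; // mask to recover the unsigned value

        System.out.println(sample); // 40000
    }
}
```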
The primary reason for using byte or short is one of clarity; i.e. the program states categorically that only 8 or 16 bits are to be used.
You don't achieve any space saving by using byte or short instead of int, because most Java implementations align stack variables and object members on word boundaries. Primitive array types are handled differently; i.e. elements of boolean, byte, char and short arrays are byte aligned. But unless the arrays are large in size or large in number, they don't make any significant contribution to the app's overall memory usage.
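The element payload is where the difference shows up at scale. A back-of-the-envelope sketch, ignoring the small per-array object header:

```java
public class ArraySizeDemo {
    public static void main(String[] args) {
        int n = 1_000_000;
        // Approximate payload sizes for one million elements:
        long byteArr  = (long) n * Byte.BYTES;    // ~1 MB for byte[n]
        long shortArr = (long) n * Short.BYTES;   // ~2 MB for short[n]
        long intArr   = (long) n * Integer.BYTES; // ~4 MB for int[n]

        System.out.println(byteArr);  // 1000000
        System.out.println(shortArr); // 2000000
        System.out.println(intArr);   // 4000000
    }
}
```

So the saving is real for million-element arrays, and negligible for a handful of fields.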
So I guess that the main reason that developers don't use byte or short as much as you (a C developer?) might expect is that it really doesn't make much (or sometimes any) difference. Java developers tend not to obsess over memory usage :-).
In a 64-bit processor, the registers are all 64 bits wide, so if your local variable is assigned to a register, it doesn't use memory at all and choosing a smaller type saves no resources, whether it is a boolean, byte, short, char, int, float, double or long. Objects are 8-byte aligned, so they always take up a multiple of 8 bytes in memory. This means Boolean, Byte, Short, Character, Integer, Long, Float, Double, AtomicBoolean, AtomicInteger, AtomicLong and AtomicReference all use the same amount of memory.
As has been noted, the shorter types are used for arrays and for reading/writing data formats. Even then, short is not used very often, IMHO.
It's also worth noting that a GB costs about £80 in a server, so a MB is about 8 pence and a KB is about 0.008 pence. The difference between a byte and a long is about 0.00006 pence. Your time is worth more than that, especially if you ever have a bug which resulted from having a data type which was too small.
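A quick check of that arithmetic, assuming £80 per GB and 1024-based units as above:

```java
public class MemoryCostDemo {
    public static void main(String[] args) {
        double pencePerGB = 80 * 100;                  // £80 = 8000 pence
        double pencePerKB = pencePerGB / (1024 * 1024);
        double byteVsLong = pencePerKB * 7 / 1024;     // a long costs 7 bytes more than a byte

        System.out.printf("%.5f pence per KB%n", pencePerKB);      // ~0.00763
        System.out.printf("%.6f pence per variable%n", byteVsLong); // ~0.000052
    }
}
```

Both figures round to the ones quoted above.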
Arithmetic on bytes and shorts is more awkward than with ints. For example, if b1 and b2 are two byte variables, you can't write byte b3 = b1 + b2 to add them. This is because Java never does arithmetic internally in anything smaller than an int, so the expression b1 + b2 has type int even though it is only adding two byte values. You'd have to write byte b3 = (byte) (b1 + b2) instead.
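A compilable illustration of that promotion rule, including the wrap-around you silently accept with the narrowing cast:

```java
public class BytePromotionDemo {
    public static void main(String[] args) {
        byte b1 = 100, b2 = 100;

        // byte b3 = b1 + b2;        // won't compile: b1 + b2 has type int
        byte b3 = (byte) (b1 + b2);  // explicit narrowing cast required

        System.out.println(b1 + b2); // 200 (the int result)
        System.out.println(b3);      // -56 (200 wraps around in 8 bits)
    }
}
```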
byte[] happens all the time: buffers, specifically for networks, files, graphics, serialization, etc.
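The canonical form of that buffer is the read/write copy loop; a self-contained sketch using in-memory streams in place of a real network or file source:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class BufferCopyDemo {
    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream(
                "hello, buffers".getBytes(StandardCharsets.UTF_8));
        ByteArrayOutputStream out = new ByteArrayOutputStream();

        byte[] buffer = new byte[8]; // the ubiquitous byte[] buffer
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read); // only write the bytes actually read
        }

        System.out.println(out.toString("UTF-8")); // hello, buffers
    }
}
```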
I used short extensively when creating an emulator based on a 16-bit architecture. I considered using char so I could have stuff unsigned, but the spirit of using a real integer type won out in the end.
edit: regarding the inevitable question about what I did when I needed the most significant bit: with the thing I was emulating it happened to almost never get used. In the few places it was used, I just used bitwise modifiers or math hackery.
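The masking trick that answer alludes to looks like this: short is signed, so a 16-bit value with the top bit set reads back negative unless you mask it, while char gives you the unsigned 16-bit view for free.

```java
public class SixteenBitDemo {
    public static void main(String[] args) {
        // A 16-bit register value with the most significant bit set.
        short reg = (short) 0xFFFF;

        System.out.println(reg);          // -1 (short is signed)
        System.out.println(reg & 0xFFFF); // 65535 (unsigned view via masking)

        // char is Java's only unsigned 16-bit type, the alternative mentioned above:
        char creg = 0xFFFF;
        System.out.println((int) creg);   // 65535
    }
}
```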