I have some devices which read an RFID tag and pass the tag's serial number over the serial port.
It seems to me that it is "better" to send the serial number as ASCII text, two characters for each byte of the tag ID, especially since some devices append a terminating 0x0D 0x0A (CR/LF).
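To make that concrete, this is roughly what I believe the ASCII-style devices put on the wire (the 4-byte tag ID below is invented purely for illustration):

```python
# Hypothetical 4-byte tag ID, sent as ASCII hex followed by CR/LF.
tag_id = bytes([0x12, 0x34, 0xAB, 0x0D])

frame = tag_id.hex().upper().encode("ascii") + b"\r\n"
print(frame)  # b'1234AB0D\r\n'

# Every payload byte is one of '0'-'9' or 'A'-'F', so CR/LF can never
# appear inside the payload and splitting the stream on it is safe.
serial = frame.rstrip(b"\r\n").decode("ascii")
print(serial)  # '1234AB0D'
```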
Now I have found one device which sends the raw binary instead: to send "12" it sends the single byte 0x12 rather than 0x31 0x32 (ASCII '1' and '2'). That means I cannot distinguish a terminating CR/LF from a genuine 0x0D or 0x0A in the data. When I asked, I got some waffle about it not mattering because the strings are fixed length, in which case why bother with CR/LF at all? And, just to be (in)consistent, they send the CR and LF as two bytes each.
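Here is the problem as I understand it, sketched with the same invented tag ID and assuming a plain CR/LF terminator for simplicity:

```python
# Same hypothetical tag ID, but sent as raw binary followed by CR/LF.
tag_id = bytes([0x12, 0x34, 0xAB, 0x0D])
frame = tag_id + b"\r\n"
print(frame)  # b'\x124\xab\r\r\n'

# A readline()-style parser that stops at the first CR or LF would cut
# this frame short, because the 0x0D is real payload data. The only safe
# way to parse is to count bytes: read a fixed-length payload, then
# expect (or discard) the terminator.
PAYLOAD_LEN = 4                # fixed-length tag ID, per the vendor
payload = frame[:PAYLOAD_LEN]
print(payload.hex().upper())   # '1234AB0D', recovered correctly
```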
As you may have gathered, I am fairly new to this project and somewhat confused by the inconsistencies between different manufacturers' devices.
The manufacturer of this one is happy to change their firmware to accommodate me. Should I ask them to switch to two ASCII characters per byte, like the other devices?