Convert Decimal to 2 bytes, I'm stumped

I'm always a little nervous about shift operations on signed integers. I'd use unsigned:

void Dec2Hex (uint16_t DecimalVal, uint8_t &MSB, uint8_t &LSB)
{
    LSB = DecimalVal & 0xFF;         // low byte
    MSB = (DecimalVal >> 8) & 0xFF;  // high byte
}

Both names are incorrect and meaningless in this application. You're simply separating the MSB and LSB of a 16-bit (unsigned) integer into two variables to transfer over a byte-oriented interface. Nothing is being converted. Countless newbies get wrapped around the axle over Hex, Decimal, etc. As mentioned above, these are merely human-readable presentations of binary values.
