Still learning here, and not great at these bitwise operations. I have a protocol that specifies data be sent as 2 bytes; however, the 2 most significant bits of each byte are reserved. That is, the first two bits must always be zero. So 00111111 is valid, but 11111111 is not.
That said, they're still 2-byte values. So how do I spit out a value like this for any given int? Up to 4095, I presume (12 bits total). Say I want to send 1000, for example; that would be 00001111 00101000. How do I do this programmatically in code?
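A minimal C sketch of the kind of split being asked about (the variable names here are only illustrative): mask off the low six bits for one byte and shift right six places for the other.

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint16_t value = 1000;                 /* binary 0000 0011 1110 1000         */
    uint8_t  msb   = (value >> 6) & 0x3F;  /* upper 6 of the 12 bits: 0000 1111  */
    uint8_t  lsb   = value & 0x3F;         /* lower 6 bits:           0010 1000  */

    printf("%u %u\n", (unsigned)msb, (unsigned)lsb);   /* prints: 15 40 */
    return 0;
}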
BenF said, "To get MSB (top 8 bits) from a 16-bit variable you need to shift 8 bits right (not 6). "
My first reaction was, oops! He's right. Then I realized that we weren't trying to get the top 8 bits of a 16-bit variable; we were trying to build the high-order output byte from the bottom four bits of the most-significant byte and the top two bits of the least-significant byte. That's because we are outputting the lower 12 bits of the variable, split into two bytes, with each byte carrying 6 of the 12 bits.
Each letter represents one bit, "A" is the least significant bit:
PONMLKJI HGFEDCBA
The low order byte we send out will have the bits:
FEDCBA
For the high order byte, we want to send:
LKJIHG
So to get the high-order byte we need to shift the variable down 6 positions, not 8.
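Roughly, in code (a sketch in C; the function and parameter names are just for illustration):

#include <stdint.h>

/* value: PONM LKJI HGFE DCBA  (A = least significant bit; P..M must be zero) */
void split12(uint16_t value, uint8_t *high, uint8_t *low)
{
    *low  = value & 0x3F;          /* 00FEDCBA - low-order byte  */
    *high = (value >> 6) & 0x3F;   /* 00LKJIHG - high-order byte */
}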
-Mike
P.S. You're absolutely right that the data type for LSB and MSB needs to be "byte".
I would always include the 16-bit typecast because, as you suggest, the result (potential bit loss) would otherwise depend on the compiler - and we don't want that. (An alternative safe approach is to first move MSB into 'a' and then do the left shift.)
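For example (a sketch; 'a' is assumed to be an unsigned 16-bit variable and MSB/LSB the two 6-bit bytes):

uint8_t  MSB = 0x0F, LSB = 0x28;    /* example: the two bytes for 1000              */
uint16_t a;

a = ((uint16_t)MSB << 6) | LSB;     /* explicit 16-bit cast before the left shift   */

/* alternative safe approach: move MSB into the 16-bit variable first, then shift   */
a = MSB;
a <<= 6;
a |= LSB;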
If we don't know the value of MSB/LSB, I would also add code to force the two upper bits to '0', as in the following:
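A sketch of that idea, ANDing each byte with 0x3F so the two reserved bits are always cleared:

MSB &= 0x3F;   /* 0x3F = 0011 1111: keep the low six bits, zero the two reserved bits */
LSB &= 0x3F;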