Having reflected a bit more on this, I see that in this code:

```c
const long deviceBits = 10;
const long deviceWordLength = 16;
const long deviceMask = (1 << deviceBits) - 1;
```

there's no need for any of the constants to be declared *long*. The counting constants, describing the length of the significant data and the length of the word it resides in, can obviously be byte-sized, at least until we start using 257-bit integers. The mask doesn't need to be any bigger than the input data word, in this case *int*.

Hmmph.