Why does Hex not read out in an array for LED "clock"?

So rather than punching in a 32-bit binary value for a single "column", I could simply type in something like 0xFF.

And so you can (the leading 24 bits would be padded with zeroes in this case).
Hex, binary, octal, or decimal: the compiler doesn't care. They're all just notational conveniences, and they all end up as the same binary value.
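
As a minimal sketch (the array name `columns` and the 32-bit-per-column layout are assumptions about your code, not anything confirmed here), all four entries below compile to the identical bit pattern:

```cpp
#include <stdint.h>

// Hypothetical column buffer: one 32-bit word per LED column.
// Every literal below is the same value, 0x000000FF; the compiler
// zero-extends each one to fill the upper 24 bits of the uint32_t.
uint32_t columns[4] = {
    0xFF,        // hexadecimal
    0b11111111,  // binary (0b literals: C++14, and a GCC extension on Arduino cores)
    255,         // decimal
    0377         // octal
};
```

Whatever code reads `columns` to drive the LEDs sees four identical words; there is no way to tell from the compiled output which notation was typed in the source.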