Simple math doing my head in - when is an int treated like an int?

I'm trying to do something simple - convert analogue pin input (0 - 1023) to byte equivalent (0 - 255).

So why are you going about it in such a complicated way?
0..1023 represents a span of ten bits.
0..255 represents a span of eight bits.

1023 / 2² = ?
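
To spell the hint out: dropping the two extra bits is just a divide by 4, or equivalently a right shift by 2. Here's a minimal sketch of the idea (the pin choice A0 is only an example, not from the original post):

```cpp
// Scale a 10-bit analogRead() value (0..1023) down to a byte (0..255)
// by discarding the two least significant bits.

const int sensorPin = A0;  // example pin; use whichever analogue pin you read

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(sensorPin);  // 0..1023 (10 bits)
  byte scaled = raw >> 2;           // same as raw / 4 -> 0..255 (8 bits)
  Serial.println(scaled);
  delay(100);
}
```

For non-negative values like these, `raw >> 2` and `raw / 4` give the same result; the shift just makes the "throw away two bits" intent explicit.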