Bar talk, no stupid questions: if I see "int", does the code not care about efficiency?

This is kind of a 'bar talk' question, so hopefully people won't jump down my throat for asking such a stupid beginner question.

I'm looking at 8-bit microcontroller code and have started to realize that the normal "int"s we might see in Arduino code actually get translated into several registers, since the micro is only 8-bit (please correct me if I'm wrong).

So if I see, for example, someone counting to a hundred using an "int", does that mean I'm not looking at code that's trying to be particularly efficient? Not explicitly "microcontroller" code?

Like, is it being very wasteful, and if I do that, does it mean I don't care how many instructions my code turns into?

Because the compiler doesn't know that you're only using it to count to a hundred?

If you use an int, how much less efficient is it than using a smaller data type? Please forgive me if I've made a mistake in my understanding.

Yes, using int where a byte will suffice is less efficient.

Use byte to count from 0 to 255, to hold pin numbers, or to use 0/1 as a flag. Use int to count higher - -32768 to 32767 - or unsigned int for 0 to 65535. Use long for even higher - +/- 2 billion (roughly) - and unsigned long for up to about 4 billion (2^32 - 1) and for time values (anything millis() or micros() related). An int is 2 bytes - why use 2 when 1 will do, especially for changing values that get stored in the 2048 bytes of total SRAM? A long is 4 bytes.

if I see someone counting to a hundred using "int" does this mean I'm not looking at code that's trying to be particularly efficient?

Correct. But don't forget the maxim: "premature optimization is the root of all evil." I would guess that in the Arduino world, problems caused by using too small an integer type (byte instead of int, or int instead of long, because the user didn't really understand the limits) are MUCH more common than situations where an unnecessarily large integer pushes a sketch over the edge into "insufficient performance" territory. Even the slower Arduinos are pretty fast compared to prior "user friendly" single-board microcontroller systems (probably hundreds of times faster than a Basic Stamp or BASIC-52 system, for instance).

(When you take chemistry or physics, there used to be a lot of "history" that you were expected to learn as well. Discovery of the Elements, early investigations into the nature of electricity, what came prior and how and why people decided that things were different. I wonder if "computer science" has reached a stage where a proper education should include more explicit historical information than currently seems to be taught...)