No space savings when using uint8_t?

Hey,

I am trying to free up some space in my program, so I started using uint8_t and uint16_t for variables that don't need a full 32-bit integer. But when I compile my code there is no benefit, neither in program storage nor in the storage for global variables.

Am I missing something, or will the IDE just convert everything the same way as if I had used auto as the data type for variables?

constexpr int POTI_MIN = 30;
constexpr uint16_t POTI_MAX = 975;
constexpr int POTI_BRIGHT_MIN = 70;
constexpr uint8_t POTI_BRIGHT_MAX = 254;

My guess is that since those are compile-time values, the compiler was previously optimizing the space usage.

You can effectively save considerable memory space by using appropriately chosen variable sizes for read/write data arrays.


Yes, you didn't say which Arduino (or other "compatible" MCU board) you are using.


My guess is that we are facing yet another XY problem.


@e1m1

Most likely the compiler had already optimized your constant expressions away before, hence you don't see a difference in memory usage.

Very likely indeed.

As those are constants, the compiler and optimiser know exactly what values to expect and probably got rid of the variables altogether, replacing them directly with literals in the code.

Depending on the architecture, some constants might also go in Flash memory and not use any RAM for storage.

A non-original Nano board.

How did you declare these variables previously? A classic Nano has an 8-bit MCU, and an int on this platform is 16 bits wide. Unless you used long or unsigned long, only the uint8_t declaration above could make a difference.

If you are having problems with SRAM usage (on say the ATmega328P) then usually the biggest benefits come from using the F() macro for constant character strings.

I'll repeat my mantra/rant that it's 2025 and no one should be spending time optimizing code/memory to fit into a microcontroller unless you're building 1,000,000+ units.

If you find yourself worrying about variable sizes, it's most likely that you're doing something with the wrong processor for the job. Upgrade to a bigger one.


You may be right, but I am new to programming and I think it can't hurt to learn a bunch of different things at the beginning, especially for reading code written by others. I stumbled upon the int/byte difference on my first project, where I substituted ints with bytes where appropriate, which saved space.

After reading code written by others, I wondered what uint8_t meant and learned that this is the "right" way to declare 8-bit integer variables. And so on.

It's a valid question to ask on a forum. There are plenty of 8-bit micros out there that don't have much RAM or ROM.

Some background information:

When the C language was first developed, the design team wanted to give the makers of compilers as much freedom as possible, so that a compiler could be tailor-made for a specific processor. That meant int was defined as the natural data width of that processor. This, however, meant that a program written in C might behave differently on different computers, so a programmer needed to know which computer would be used (less portability).

Pretty soon the design team decided that 8 bits was ridiculously low and decreed a 16-bit minimum.

Later on, the fixed-width types such as uint8_t were introduced to give programmers more control over data formats, independent of the computer.

The Arduino IDE was originally developed for 8-bit processors. The compiler will use 16 bits for a variable declared as int, as required by the language standard. The difference between int and int8_t will show up in RAM usage.

If a number is declared as a constant, the compiler will try to fold it into the object code. A constant can also be stored in flash memory by using the keyword PROGMEM (for some processors the Arduino core does that automatically).

Not very helpful, as we still do not know what type of clone you are using. In other words, we need to know which Arduino board yours is equivalent to.

For saving memory (you can see the effect directly in the memory usage report), use the F() macro when printing strings.

https://www.baldengineer.com/arduino-f-macro.html

In these cases the variables weren't replaced by the compiler (as they weren't declared const or constexpr), so you gained the full benefit of using the proper (smaller) variable size.

If you are looking for further ways to reduce SRAM usage on your Nano, post your code in code tags.

That’s not totally accurate.

Indeed the very first definition of the C language, as developed by Dennis Ritchie in the early 1970s at Bell Labs, did not explicitly impose a minimum size of 16 bits for int. In those early days, C was designed to be flexible and closely tied to the hardware of the PDP-11, which was a 16-bit machine. As a result, int was typically 16 bits on that system, but this was more a reflection of the hardware than a language requirement.

Before the ANSI C standard (C89), there were no formal specifications about the minimum size of int. The early C compilers followed the conventions of the systems they were built for, which is why int was often 16 bits.

It wasn’t until the development of the ANSI C standard that more formal requirements, such as int being at least 16 bits, were codified to ensure a minimum level of portability and consistency.

But even before the standard, I am not aware of any C compiler implementing the int type as 8 bits. While 8-bit microcontrollers and processors existed (and still exist), C compilers for these systems typically implemented int as 16 bits to provide a more practical range for arithmetic operations.

(Developers used char when they wanted 8 bits, which could be either signed or unsigned depending on the system, which in turn could lead to portability issues.)


There's actually a compiler switch for avr-gcc (-mint8) to make "int" 8 bits...


Thanks, interesting (as nowadays that goes against the standard).

I worked with programmers who were taught not to improve their own code because faster PCs would come along.

NONE of them was other than mediocre. NONE of them generalized code. They were ALL happy to re-invent the same basic apps over and over.

And storage (both RAM and disks) became larger and cheaper, displays got bigger and gained more colors, and modems got faster.

True, Rob, but consider: if those CompSci degree holders (in 1987-88) had improved themselves, they would have been able to get more out of the new stuff.

For me, looking into improving some code often gives me a view, an arrangement, that can be exploited far beyond what I did before, often using standard methods.

I see a lot of waste shipped with new hardware. To me that is room to go beyond expectations and not be mundane. I have slowed down, but I have been squeezing more out of PCs since the early 80s, when shortening waits got me thank-yous. Even after 1990, I made second-wave hardware outrun the new stuff and kept two projects within budget.
