I have used the following method for years:
#define VALUE 5
void setup()
{
  pinMode(VALUE, OUTPUT);
}
This is how the Arduino examples always seem to do it:
int value = 5;
void setup()
{
  pinMode(value, OUTPUT);
}
I do understand that a variable can be modified at run time and a #define cannot. But if the value never gets modified within the program, as with a pin assignment, why set it up as a variable? And the fact that a #define cannot be accidentally changed within the program is a benefit in this case.
Also, why does Arduino declare a port pin as an int when it has far fewer than 256 possible values? Wouldn't using unsigned char save a byte? I know that sounds silly, but using int for every variable can add up in wasted bytes. Borland DOS C did the same thing, int as a default, but I always declare variables to match the bit length of what they will hold.
I believe the option in post#2 will not result in allocation of RAM, since the constant is known at compile time and can be inserted directly into the code.
whereas, obviously, simply
byte value = 5;
must consume RAM, since there is no guarantee it won't change.
Well, I could be wrong, but I believe the compiler has the option to optimize the code that way; it's just not guaranteed that no RAM is used.
Otherwise, why not splurge, and use const long value = 5;?
Regardless, I may be wrong, and I'll happily cede to the "software people" when they arrive.
C++ still struggling with the legacy of decisions made in 1972...
Modern compilers are really good at optimizing away stuff that is not needed. That is good, because the programmer can specify the intent more clearly without worrying too much about bloat.
Ideally, the preprocessor should be sent to the dustbin of history, but unfortunately I have seen some compilers allocate RAM for "const int" so we have to be careful.
Even so, that's an OK tradeoff for getting the strong typing and error checking of the compiler instead of the simple string substitution of the pre-processor.
Meh. The preprocessor does text substitution, which allows various useful behaviors that are not supported (or are very obscure) using just the C++ language constructs.
But for constants, const <type> var = val; is preferred.
There are some serious differences between #define and const int.
A #define does not allow type checking, whereas a const int does.
The const int is therefore more robust / safe.
A #define allows command-line overriding of the value. E.g. if you want to define the size of a buffer, a const int needs the source code edited, while a #define allows a -Dxx=yy parameter on the command line.
A #define is a preprocessor command and has other purposes, e.g. conditional compilation for multi-platform support, which cannot be done with a const int.
So I think best practice is to use a const type when you need a constant number or string, and to only use #define if you want it to be command-line overridable.
A combination exists, to have a type-checked, command-line-overridable constant:
#define CMD_LINE_ARRAY_SIZE 5
const int arrSize = CMD_LINE_ARRAY_SIZE;
Never used it, and I use #defines quite a lot for constants, so mea culpa.
True, but like many things, it should be used in moderation, otherwise it leads to technical debt and a maintenance nightmare. It's less of a problem nowadays with CI, but even on projects with just two targets I struggle to get devs to make changes without breaking the other target.
Then of course management asks, "Hey, I thought the code could run on any target, so why can't we release just one firmware file to manufacturing which runs on all variants?"
So separate code bases for each target, even though 90+% of the code is the same, which will now require identical changes to be made in each code base whenever there's a bug fix?
If you don't understand how to use the tool, don't use it. But don't condemn it for the majority of people who do know how to use it properly.
I was really referring to the preprocessor in general. #defines are OK for simple literals, but even there you can use other (better) methods. Macros defining pure functions can mostly be replaced with inline functions. More "tricky" uses of macros should be avoided for obvious reasons.
It's all very well saying "it's OK if you know what you are doing," but I don't think I've ever met a developer who fully understands the language standard. Often those who think they do are surprised to find they rely on undefined behavior.
In 1972, the preprocessor was a neat trick. In 2025, not so much.