Great point pert! I'll just repeat that you don't save memory using either one. It's a trade-off between code space & data space. With #define, the resource itself (the value) gets scattered all over the code, wherever it was used. That makes the code bigger, since a copy of the value becomes part of every instruction that uses it, but it makes the data usage smaller, since you aren't using any data space at all. Run time plays a part in the decision too: you gain a few clocks using #define.
I must also note that #define is far less safe. By today's standards, I would say not safe at all.
Example:
// Boat.h:
#ifndef Boat_h
#define Boat_h
#define DIM 0x20
class Boat
{ ...
};
#endif
// Car.h:
#ifndef Car_h
#define Car_h
#define DIM 0x40
class Car
{ ...
};
#endif
// Main sketch
#include "Boat.h"
#include "Car.h"
...
dimTheCar( DIM );  // <--- DIM is 0x40

// Now reverse the #includes...
#include "Car.h"
#include "Boat.h"
...
dimTheCar( DIM );  // <--- DIM is 0x20
...
The compiler won't bitch, cuz with #define you are talking to the preprocessor, not the compiler. Redefining a name to a different value gets you, at most, a "DIM redefined" warning (which the Arduino IDE hides by default), & the build keeps right on going. There are no constant #defines. In fact, redefining things is a preprocessor feature, & there are lots of times where it's a sensible thing to do, particularly for multiple platforms. So when it hits the 2nd definition in the 2nd header, it does not crap out. It simply changes the value to the 2nd one & starts substituting that from then on (until it changes again). The compiler never sees any of this, cuz by the time the code reaches it all the names have already been replaced. The compiler has no idea there are two definitions. It has no idea there are any definitions. It thinks there are none. It only sees numbers plugged into the instructions.
This behavior means the compiled code will be functionally different depending on the ordering of the #includes. Catastrophic news. And quite hard to track down when you're plowing through a lot of libraries...assuming you even notice it in the first place.
The IDE does not protect you from yourself when you use #define. You really have to be diligent about creating unique names.
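If you really have to live with a #define, one trick is to make the preprocessor do the complaining for you. Just a sketch, reusing the Car.h header from above; the #ifdef/#error guard is the addition:

// Car.h (sketch)
#ifndef Car_h
#define Car_h

#ifdef DIM
#error "DIM is already #defined by another header; rename one of them"
#endif
#define DIM 0x40

class Car { /* ... */ };

#endif

The include order still matters, but now the build screams at the clash instead of silently handing you the wrong number.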
Naturally, when you declare a variable & make it const, the code makes it to the compiler intact, names & all, & so it is able to protect you from using it incorrectly. Plus, as pert said, you can give it a smaller type & sometimes use less space.
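Redo the same two headers with const & the clash gets caught at compile time instead of being settled by include order. A minimal sketch, reusing the Boat/Car names from above (the class bodies are just placeholders):

// Boat.h
#ifndef Boat_h
#define Boat_h
#include <Arduino.h>
const byte DIM = 0x20;
class Boat { /* ... */ };
#endif

// Car.h
#ifndef Car_h
#define Car_h
#include <Arduino.h>
const byte DIM = 0x40;   // same name as in Boat.h, on purpose
class Car { /* ... */ };
#endif

// Main sketch
#include "Boat.h"
#include "Car.h"   // compile error right here: DIM is already defined in Boat.h

The error points at the exact line that causes the conflict & typically shows where the first definition came from, instead of depending on which header won the include-order race.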
Another memory factor is how often the thing gets used. A const or a variable always takes up the same predictable amount of space: one unit of whatever datatype it is. In the code where it is used, memory addresses get substituted in place of the name, so using it all over the place doesn't cost any extra resources; the resource itself exists in only one place.

With #define, whatever you defined just gets plopped in directly, replacing the name. At runtime the processor can use immediate mode to get the value (it is part of the code now, so there is no need to fetch it from memory). That is faster to execute, by the way. But if you #define something complex & then use it generously, you end up with lots of repeated copies that quickly add up to more than a single memory slot would use. It shows up in code space rather than data space, but total memory usage will be higher with #define in that case. Basically you are spreading copies of the resource all over the place; it exists everywhere it is needed. The more often you use it, the better it is to stay away from #define. But if you only use it in a single file in a couple of places, the difference will be negligible & maybe even a tad in #define's favor.

On a platform like Arduino, that last case is the one you'll usually see, simply because the code cannot be physically large. Tight resource limits keep the code small & simple, & therefore this decision is far less likely to have a significant impact. If it isn't significant, you can safely ignore it.
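To make that concrete, here's the shape of the two choices side by side (a sketch; the names are made up, & the exact flash/SRAM numbers depend on the compiler & its optimization settings):

#define BLINK_DELAY_MACRO 250          // the literal 250 gets pasted into every use site
const unsigned int BLINK_DELAY = 250;  // one typed object the compiler can see & check

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);
  delay(BLINK_DELAY_MACRO);   // textual substitution: 250 baked in here...
  digitalWrite(LED_BUILTIN, LOW);
  delay(BLINK_DELAY_MACRO);   // ...and again here
  delay(BLINK_DELAY);         // same value, but the name survives to the compiler
}

For a tiny number like 250 the difference is noise either way; it only starts to matter when the defined thing is big or complicated & used all over the place.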
We have devolved now into one of those areas where it is really up to the coder to decide how reliant on the compiler he is going to be. In the old days compilers didn't do much more than translate. There was no preprocessor & therefore no #define. There was no const either, only variables. If you wanted a constant you had to make a variable & then never make the mistake of changing its value, ever. You can still do that, but nowadays you get lots of options like const, which basically let you ask the compiler to do some of your thinking for you. I dislike relying on computers to think for me, mainly because they are such astonishingly stupid things. They only do precisely what you tell them...never what you don't, even when it's obvious. You can write this line anywhere:
1 == 0;
The computer won't even blink (maybe a warning about an unused result, if you've even got warnings turned on). It's too damn stupid to reason it out. It just evaluates "false" & moves on. A person looking at that will be like WTF is this doing here??? It may be able to think orders of magnitude faster than I can, but even the most moronically simple nonsense I can conjure will be orders of magnitude more reasoned than anything it thinks up in 1/10000 the time. I am uncomfortable relying on such a brain to decide how I am, or should be, implementing my solution. In fact, most of the frustration people endure when dealing with compilers (& there has never existed a programmer who didn't) comes down to the computer being too fuckin stupid to figure out what you're telling it, because you just aren't being clear enough for its feeble mind to grasp. Nobody has told it how to handle vagueness, so it doesn't. Stupid.
My attitude is that the best way to avoid relying on the compiler to keep you from programming like a dumbass is to learn not to program like a dumbass, not to rely on an idiot to point out your mistakes. In the end you will be a better programmer if you tend to prefer doing things yourself. I've been a programmer since 1977. We had almost nothing to help us back then. Learning in that environment made me adept at avoiding things like name conflicts. Today, I don't need help with that...I've got it wired, so I don't concern myself with it. I don't think I have ever once sat down & said "I'm going to need a namespace for this" before starting...or even later on. I've run into namespaces, but I've just never had a need to use them. Cuz I don't write dumbass code anymore.
Actually, in my own code I like static const class members, because they force me to be clear about using them & they isolate the constants from the rest of my code. For that class, all the instances share one unambiguous understanding of what DIM means. Outside the class, I must clearly specify which version of DIM I want, because I'm obligated to use the ClassName::DIM syntax to get at it. If there's another class with the same name inside it, then ClassName2::DIM makes it clear when I want the second one. I really use the feature for readability & maintainability rather than as a crutch for not thinking things through completely. I get the crutch as a bonus, I suppose, & I don't complain, but it ain't why I choose it. I like it cuz it requires that clarity be employed...something the machine is fond of anyway.
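With the Boat/Car example, that looks roughly like this (a sketch; the dimTheCar() body is invented just to have something to call):

#include <Arduino.h>

class Boat {
  public:
    static const byte DIM = 0x20;   // Boat's own meaning of DIM
};

class Car {
  public:
    static const byte DIM = 0x40;   // same name, zero conflict
};

void dimTheCar(byte level) {
  analogWrite(9, level);            // stand-in for whatever dimming really does
}

void setup() {
  dimTheCar(Car::DIM);              // unmistakably 0x40
  dimTheCar(Boat::DIM);             // unmistakably 0x20
}

void loop() {}

Each DIM lives with its class, & the call site says exactly which one it means.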
Another bonus of static members: if you ever need the value of the constant (you always forget after a while), it is plain where to go & find it. With a #define there's no telling where the thing comes from, & brute-force plowing through files is the only way. It's stuff like that I'm eager to let the moron help me find. It is a stupid task, perfectly suited for an idiotic brain to handle.
In the end, which method you like most is almost entirely programmer preference. In a professional setting you have to think professionally & know all the ins & outs of clarity, reusability, maintainability, reliability, resource management, scaling, performance, etc. But in a place like this, where the platform doesn't even have enough memory to make anything really complicated, you can get away with doing it any way you want, as long as it does what you need & you understand it. The more complex you make it, the more likely you are to run into a shoehorning issue with memory space. Here, simpler is better & smaller is better. And #define is very simple. So I think it is fine to use it pretty freely. If you run into problems, you can always add complexity later; believe me, that part is trivial.
By the way pert, you may have forgotten that you can give a #define a type by writing:
#define DIM ((byte)0x20)
Then you get bytes instead of ints. Just don't add a trailing semicolon (it would get pasted into every place DIM is used & break the code), & the extra parentheses keep the cast glued to the value wherever it lands.
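To see why the semicolon matters, here's a quick sketch (names made up):

#define DIM_BAD (byte)0x20;      // the stray ; rides along with every substitution
#define DIM_GOOD ((byte)0x20)    // no semicolon, parentheses keep the cast together

void dimTheCar(byte level) { }   // placeholder for the earlier example

void setup() {
  dimTheCar(DIM_GOOD);           // expands to dimTheCar(((byte)0x20)), which is fine
  // dimTheCar(DIM_BAD);         // would expand to dimTheCar((byte)0x20;), a syntax error
}

void loop() {}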