#define is used to create a name/value pair. It does not define a variable.
#define LED_PIN 13
creates a name, LED_PIN, and a value, 13. Wherever the name appears later, the value will be substituted before the compiler runs.
What is the difference between char and char* when defining a variable?...
A char is a variable that can hold one character. A char * is a pointer to a memory location that can hold one or more chars. It must be made to actually point to some memory before it can be used.
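A minimal sketch of the difference (the names buf, msg, and fill_buf are just for illustration):

```cpp
#include <cstring>

char c = 'A';              // one char: storage for a single character
char buf[6];               // an array: storage for up to 5 chars plus the '\0' terminator
const char *msg = "hello"; // a char*: points at existing (read-only) characters

// A char* is only useful once it points at real storage:
char *fill_buf() {
    char *p = buf;         // make the pointer refer to existing memory
    strcpy(p, "hi");       // now it is safe to write through the pointer
    return p;              // buf now holds "hi"
}
```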
The #define is not declaring or defining a variable. It is simply a substitution rule.
#define foo bar
then when foo shows up in subsequent code, it gets replaced by bar. Only a complete token is replaced, so foobar won't become barbar. Also, foo inside a string literal is not replaced.
This command is called a preprocessor directive. The substitution occurs before the code is compiled.
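As a sketch of the substitution rule (the names here are made up for illustration):

```cpp
#define ANSWER 42             // a substitution rule, not a variable
#define DOUBLE_IT (ANSWER * 2)

int answer_value = ANSWER;    // the preprocessor rewrites this as: int answer_value = 42;
const char *label = "ANSWER"; // ANSWER inside a string literal is NOT replaced
int doubled = DOUBLE_IT;      // becomes (42 * 2)
```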
There are a lot of good uses for this. If you want to blink an LED, you could write digitalWrite(13, HIGH) and digitalWrite(13, LOW) with the pin number hard-coded, but that would be painful if you later wanted to blink a different pin: you would have to change the pin number on multiple lines and pray not to make a mistake. This works much better:
#define LED 13
digitalWrite(LED, HIGH);
delay(1000);
digitalWrite(LED, LOW);
delay(1000);
why not simply define a variable int led=13...? memory usage?
Could you please give an example of using char*...?
why not simply define a variable int led=13...? memory usage?
Well, it's not really a "variable"... so:
Another way to achieve something similar is to declare it as a constant:
const int Led = 13;
This prevents you from assigning another value to it during the execution of your program, and allows the compiler (not the preprocessor) to optimize or substitute it as it sees fit.
There are pros/cons of this versus #define, though I feel that the const (and its cousin "inline") are preferred in most cases.
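A sketch of the const approach (the names LED and pin_mask are illustrative): the compiler can fold the value in just like a macro would, but the name obeys scope rules and is type-checked.

```cpp
const int LED = 13;   // typed, scoped, and foldable at compile time

int pin_mask() {
    // LED = 12;      // error: assignment of read-only variable (won't compile)
    return 1 << LED;  // the compiler is free to substitute 13 here itself
}
```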
consts and #defines are useful, but there is a better way, one which ensures the compiler treats the value as a compile-time constant.
Use an enum: enum values are effectively more constant than a variable marked const. The compiler cannot always guarantee that a variable is a compile-time constant, whereas an enum value is one by nature.
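For example, an enum value can be used anywhere a true compile-time constant is required, such as an array size (BUFFER_SIZE is an illustrative name):

```cpp
enum { BUFFER_SIZE = 64 };   // BUFFER_SIZE is a compile-time constant by definition

char buffer[BUFFER_SIZE];    // legal: array sizes must be compile-time constants

unsigned buffer_capacity() {
    return sizeof(buffer);   // the full 64 bytes
}
```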
pYro_65:
The compiler cannot always guarantee that a variable is a compile-time constant, whereas an enum value is one by nature.
Can you explain that? I know that const-ness can be cast away or lost in some situations, but I would have thought that a const instance of an enum type can lose its const-ness in exactly the same way that a const instance of an int type can.
Here's an idea. There are a million and one books which teach C++ programming, some of them free.
Read one.
I'm actually trying to get one!
So #define just performs a substitution: #define var 11 means that every time I want to refer to 11 (a pin number, or just a number inside an equation) I can write var instead...
pYro_65:
The compiler cannot always guarantee that a variable is a compile-time constant, whereas an enum value is one by nature.
Can you explain that? I know that const-ness can be cast away or lost in some situations, but I would have thought that a const instance of an enum type can lose its const-ness in exactly the same way that a const instance of an int type can.
I'll have a go.
I'm talking about an enum type, not an instantiated object of it. Any variable marked const (including enum types) can have the const cast away using const_cast<>().
The values used to define an enum's range must be compile-time constants themselves, which is why enum values are usable in templates.
All three of these (the #define, the const char, and the enum) are interchangeable; the main difference is that the const char is a 'read-only object', the enum is a constant expression, and we have all hopefully heard some compelling reasons why macros are not so great.
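The code being referred to isn't shown in the thread, but the three forms were presumably along these lines (all identifiers here are illustrative):

```cpp
#define LED_MACRO 13        // preprocessor substitution
const char LED_CONST = 13;  // a read-only object with an address
enum { LED_ENUM = 13 };     // a constant expression, no storage at all

// Only true constant expressions can be template arguments:
template <int Pin>
int blink_pin() { return Pin; }

int from_macro = blink_pin<LED_MACRO>();  // fine: the macro expands to the literal 13
int from_enum  = blink_pin<LED_ENUM>();   // fine: enum values are constant expressions
int from_const = blink_pin<LED_CONST>();  // fine here too, since the initializer is visible
```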
I prefer to keep large sets of enum values named, so I use a class-based enum.
There is no need to instantiate a variable of the enum type.
Also, with a class-based approach you can define methods/functions specific to that enum.
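The class-based enum itself isn't shown in the thread; a minimal sketch of the pattern (Color and its values are illustrative names) might look like this:

```cpp
#include <cstring>

class Color {
public:
    enum Value { Red, Green, Blue, Count };  // named scope: Color::Red, Color::Green, ...

    // a method specific to this enum; no instance of Color is ever needed
    static const char *name(Value v) {
        switch (v) {
            case Red:   return "Red";
            case Green: return "Green";
            case Blue:  return "Blue";
            default:    return "unknown";
        }
    }
};
```

Usage is via the class name only, e.g. Color::Green or Color::name(Color::Green); the trailing Count value gives the number of real enumerators for free.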
On a small side note: enums can be defined in macros, whereas you cannot define further #defines in a macro.
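For example, this compiles, because a macro body may contain an enum definition, whereas it could not contain another #define (DECLARE_PINS and the pin names are illustrative):

```cpp
// A macro whose expansion defines an enum:
#define DECLARE_PINS(led, button) enum Pins { LED_PIN = (led), BUTTON_PIN = (button) };

DECLARE_PINS(13, 2)  // expands to: enum Pins { LED_PIN = 13, BUTTON_PIN = 2 };
```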