Error when Using #define to set pins

I keep getting the following error when I try to compile:

Program:106: error: expected primary-expression before '=' token

If I declare my pins using the normal method, I don't have any errors. When I attempt to utilize "define", I get errors where I try to set the pin mode. I understand that the compiler replaces all instances of the #define "variables" with the numbers and thus saves me some memory. However, I don't understand all the subtleties here.

#include <Wire.h>
#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/pgmspace.h>
#include <StandardCplusplus.h>
#include <vector>
#include <iterator>
#include <string.h>

#define ResetCom = 1
#define StartButton = 2 //Do I want these on the primary and communicate via wire?
#define NextButton = 3 
//define some others

void setup() {
  pinMode(ResetCom, OUTPUT);
  pinMode(StartButton, INPUT);
  pinMode (NextButton, INPUT);
  pinMode(4, OUTPUT);
  // ?? Can I do this to set pins 6 through 17 as outputs:
  for (int j = 6; j <= 17; ++j){
    pinMode(j, OUTPUT);
  }
//stuff
}

void loop() {stuff}

Hello,

You don't put = signs, because you don't assign a value to a #define. Defines are like aliases: the preprocessor just replaces one piece of text with another at compile time :slight_smile:

So...

#define ResetCom 1
#define StartButton 2 //Do I want these on the primary and communicate via wire?
#define NextButton 3

But it is suggested to use constants instead of defines:

const uint8_t ResetCom = 1;
const uint8_t StartButton = 2;
const uint8_t NextButton = 3;

tms8c8:
I keep getting the following error when I try to compile:

Program:106: error: expected primary-expression before '=' token

There is no "=" used for assignment when using a #define.

#define MYPIN 1

works, while

#define MYPIN = 1

won't. Also, the other common mistake is to use a semicolon

#define MYPIN 1;

which in general will also cause many interesting problems.

Finally, good style is to use all uppercase for your #define macros, i.e., MYPIN rather than MyPin, if MyPin could be confused with a variable. Helps with readability and maintenance.

Thanks guys! Can you tell me why it is recommended to use the "const int" vs. #define?

tms8c8:
Thanks guys! Can you tell me why it is recommended to use the "const int" vs. #define?

It tells the compiler what type the variable is.

tms8c8:
Thanks guys! Can you tell me why it is recommended to use the "const int" vs. #define?

Well one good reason is you get those cryptic errors -- the message says there's a problem on entirely the wrong line and the syntax is entirely different.

Another is that they have a type.

I'm sorry; I'm not trying to be dense, but why is having a type important?

I see why using #define is inconvenient for the programmer, but are there any "performance" reasons not to use it? In my application, I am storing data in the program memory because EEPROM isn't big enough for all the data and I want it to be there after a power cycle. However, the data is large enough that memory is an issue. That is why I started looking into using the #define method. Being largely ignorant of these matters, are there any performance problems using it?

tms8c8:
I'm sorry; I'm not trying to be dense, but why is having a type important?

Because unless specified, the compiler makes assumptions about the type and this can be problematic in certain calculations that rely on the correct data type.

but are there any "performance" reasons not to use it?

No, but there are no performance reasons to use it either. Neither #define nor const takes up additional memory.

tms8c8:
In my application, I am storing data in the program memory because EEPROM isn't big enough for all the data and I want it to be there after a power cycle. However, the data is large enough that memory is an issue. That is why I started looking into using the #define method.

Sounds like you'd better go back to the beginning and describe your memory space problems, because #define won't be of any help one way or the other in that respect. It is really just a general method of expressing a literal string or numeric constant in a symbolic way, to aid readability and maintainability of a program.

It really is a text processor that rewrites your source file before the compiler proper sees it. So you write

#define PI 3.14159

and write

double theta1 = PI/2.0;
double theta2 = PI/4.0;

what the compiler will see is

double theta1 = 3.14159/2.0;
double theta2 = 3.14159/4.0;

To the compiler, it's as if PI never existed.

You can see why it is useful to avoid the sort of errors you might get by writing out the literals explicitly each time. You will probably end up getting things like

double theta1 = 3.14159/2.0;
double theta2 = 3.14259/4.0;

and have a lot of fun trying to figure out why your circles aren't exactly round...

Thank you, guys!

I don't currently have a memory problem; I'm just very conscious of the amount of memory I'm using. I saw some passing references to #define and thought it might be advantageous to use it. If I had paused to think, I would have remembered that using "const" accomplishes the same thing (in regard to memory). That brings me to my next question!

When and why would I use #define instead of const "type" to declare variables? Is the #define a relic and using const "type" always the better choice?

I would say always use const instead of #define when you have... er... to declare a named constant :slight_smile:
Remember that #define is just text substitution, while with const you have the usual compile-time type checking.

#define should be used to define macros like this:

#define ARY_LEN(a) (sizeof(a)/sizeof(a[0]))

IMHO.

tms8c8:
When and why would I use #define instead of const "type" to declare variables? Is the #define a relic and using const "type" always the better choice?

When you want to use conditional compilation...

#define StartButton 2

#if defined( StartButton )
  // Put StartButton code here
  pinMode( StartButton, INPUT_PULLUP );
#endif

Using something like the snippet above allows you to optionally have a StartButton. However, because the compiler does an excellent job of dead code elimination, the same thing can be accomplished using typed constants...

const boolean HaveStartButton = true;
const uint8_t StartButtonPin = 2;

if ( HaveStartButton )
{
  pinMode( StartButtonPin, INPUT_PULLUP );
}

tms8c8:
When and why would I use #define instead of const "type" to declare variables? Is the #define a relic and using const "type" always the better choice?

It really depends on the situation. A const variable is really a variable, at least so far as the compiler is concerned, and so all the code needs to be syntactically correct to compile. With the macro preprocessor, since it is just manipulating text (your source code) in various ways, it can be more flexible, particularly in cross platform programming, where the required syntax on one system won't even compile on the other.

Then, you can define things like

#define TARGET 1

#if (TARGET == 1)
#define MYMACRO code_that_will_compile_on_target_1(p1,p2,&p3)
#include <library1.h>
#else
#define MYMACRO code_that_will_compile_on_target_2(p1,ANOTHERMACRO)
#include <library2.h>
#include <library3.h>
#endif

and so on. The other thing to realise is that a const variable will obey the usual scoping rules for variables; this isn't the case for #define macros (which may be a help or a hindrance depending on the context).

It's really horses for courses. Have lots of tools in the toolbox, and keep them all sharp. :wink:

Most of my code usually ends up being compiled on several platforms for one reason or another, and trying to maintain that without using the macro preprocessor would be very tiresome and error-prone.

I wonder if this would still work if you set the compiler optimization off, or to its lowest setting? If not, you would need to know which compiler options to set.

In any case, accurate dead code elimination is a bit trickier than accurate text preprocessing, so the preprocessor may be "safer" in that regard. For example,

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=42494

#defines are also used in "include guards", i.e. the usual .h prologue:

#ifndef _THISFILE_H
#define _THISFILE_H

declaration code

#endif

Also, they are a necessity in multiplatform code, when the hardware may offer different peripherals, for example:

#if defined(PROCESSOR_ABC)
// instructions that manipulate register TMR0
#elif defined (PROCESSOR_XYZ)
// instructions that deal with register TMR3
#else
#error unsupported processor
#endif

pico:
I wonder if this would work if you set the compiler optimization off/to their lowest setting?

Yes. With optimizations turned off, the dead code is still eliminated. Oddly, a full stack frame is generated, which is dead code in my test case, though unrelated to the test.

For example,
42494 – [4.4 Regression] Missed dead-code-elimination

I suspect crippling bugs with the preprocessor can also be found. :wink:

Of course, but my point is that it is inherently a more difficult thing to accurately identify and eliminate dead code than it is to process text with preprocessing directives. I would expect the dead code to be eliminated by a decent compiler, but without the same confidence I would have with the preprocessor.

It's starting to make sense to me, now! Thanks again for all the help. I think I need to find some good text books that cover the basics of micro-controllers and IDEs/programming. I feel like my knowledge of the Arduino (both the IDE and hardware) is analogous to most people's knowledge of their car. They know how to drive it reasonably well and they have a conceptual understanding of how their engine and transmission work but that's it; they can't fix it if it quits. That's how I am on the Arduino - a lot of it is obscured by a black box of ignorance!

The good news is that it's C/C++ with a few additions, so you'll find plenty of documentation for the language fundamentals.