
Topic: How does define work?

GekoCH

Hi

I got a strange behaviour when I use this code:

Code: [Select]
#define alt  10000


void setup(){
  Serial.begin(115200);

  Serial.println(alt);

  if (55000 >= (alt * 100)){
    Serial.println("Yes");
  }
  else{
    Serial.println("no");
  }
}


void loop(){ }



Why is the output "Yes"? 10000 * 100 is greater than 55000, isn't it??

When I use the following code it works:
instead of * 100 I use * 100.0.


Code: [Select]
#define alt  10000


void setup(){
  Serial.begin(115200);

  Serial.println(alt);

  if (55000 >= (alt * 100.0)){
    Serial.println("Yes");
  }
  else{
    Serial.println("no");
  }
}


void loop(){ }




I don't get it!!

Andy

AWOL

#1
Mar 04, 2013, 11:23 am Last Edit: Mar 04, 2013, 11:26 am by AWOL Reason: 1
Quote
10000 * 100 is greater than 55000, isn't it?

No, it isn't - it is less than zero. (16-bit arithmetic, remember)

Edit: Sorry, not less than zero, but still less than 55000 (16960, I think)
"Pete, it's a fool looks for logic in the chambers of the human heart." Ulysses Everett McGill.
Do not send technical questions via personal messaging - they will be ignored.

majenko

#2
Mar 04, 2013, 11:26 am Last Edit: Mar 04, 2013, 11:32 am by majenko Reason: 1
Try forcing an integer size:

Code: [Select]

#define alt  10000UL


That will define "alt" as being an unsigned long value of 10,000.

This is why it is generally better to use "const" for numeric values instead of #define, as it lets you specify a data type for the compiler to work with:

Code: [Select]

const unsigned long alt = 10000;

Get 10% off all 4D Systems TFT screens this month: use discount code MAJENKO10

AWOL

Quote from: majenko

Code: [Select]

const unsigned long alt 10000;

But slip in an assignment, to keep the compiler happy.
"Pete, it's a fool looks for logic in the chambers of the human heart." Ulysses Everett McGill.
Do not send technical questions via personal messaging - they will be ignored.

majenko


Quote from: Majenko

const unsigned long alt 10000;

But slip in an assignment, to keep the compiler happy.

I haven't finished my first coffee of the day yet.

GekoCH

wow thx a lot

so const is similar to define, in that you still can't change the value after it is "defined" in the code?

Andy

majenko


Quote from: GekoCH

wow thx a lot

so const is similar to define, in that you still can't change the value after it is "defined" in the code?

Andy

That is correct. Once compiled, the end result is the same - the number is just a number stored in program memory, the same as with #define; the resultant code is (usually) exactly identical.

PaulS

Quote
so const is similar to define, in that you still can't change the value after it is "defined" in the code?

Vaguely similar. The advantage, as you've seen, is that the value has a type associated with it, unlike the value part of a #define statement. That type lets the compiler point out more mistakes you might make, and lets it generate correct code when the code to be generated depends on the type (as in math operations).

GekoCH

thx a lot to all of you for explaining the problem and the difference between the const and #define statements!!!

Andy
