#define, const int or int

Hello, I kinda have an idea of the differences between the three, but I just need someone to confirm it.

As I understand it, #define is a macro, which means if I declare #define XYZ 10 it doesn't take up precious memory space on the Arduino. But if I declare const int xyz=10; or int xyz=10;, this takes up memory space.

That leads me to think that when programming the Arduino, whenever I need a constant, it's better to use #define rather than int or const int. Am I correct? Do I understand this correctly?

Also, if I have many constants, is there a limit on how many #defines I can have?

Thanks in advance.

int xyz=10; // declares a variable and initializes it (in RAM);

const int xyz=10; // declares a variable and initializes it (in RAM); you'll get a compilation error if your code tries to modify it;

#define XYZ 10 // compiler will replace XYZ with 10 everywhere in your code; the effect is on the program memory;
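
To make that concrete, here is a minimal blink-style sketch showing all three forms side by side (assuming an AVR board and the avr-gcc toolchain the IDE uses; the names are just for illustration):

#define LED_DELAY_DEFINE 500   // textual substitution by the preprocessor, no storage of its own
const int ledDelayConst = 500; // typed constant; avr-gcc normally folds it into the code, so it usually costs no SRAM either
int ledDelayVar = 500;         // plain variable; typically reserves 2 bytes of SRAM (int is 16-bit on AVR)

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);
  delay(LED_DELAY_DEFINE);     // the preprocessor pastes 500 in here
  digitalWrite(LED_BUILTIN, LOW);
  delay(ledDelayConst);        // with optimization on, this normally compiles to the same code as the line above
  delay(ledDelayVar);          // this one reads the value from SRAM at run time
  // ledDelayConst = 100;      // uncommenting this gives "assignment of read-only variable"
}

The practical upshot: the const int is just as cheap as the #define here, but the compiler knows its type and will catch mistakes.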

I'm interested to know as well, but from what I've read it doesn't matter in the Arduino IDE whether you use #define or const; it depends on the compiler you're using.

A plain int will take up SRAM, though.
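
If you want to see the SRAM cost for yourself, the compiler output already reports it ("Global variables use N bytes of dynamic memory"), and you can also check free SRAM at run time with the classic avr-libc snippet below (a sketch assuming an AVR-based board such as the Uno; the global name is made up for the example):

extern int __heap_start, *__brkval;

int freeRam() {
  int v;
  return (int) &v - (__brkval == 0 ? (int) &__heap_start : (int) __brkval);
}

int plainInt = 10;             // a plain global int: lives in SRAM

void setup() {
  Serial.begin(9600);
  Serial.print(F("Free SRAM: "));
  Serial.println(freeRam());   // compare this value across builds with and without plain globals
  Serial.println(plainInt);    // use the variable so the linker can't discard it
}

void loop() {}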

This one has been discussed to death here on the forum in the past.
Please search for the various threads.

This kind of answer is completely unhelpful!

Really? You actually expect a moderator to run the Google query for you? You don't even need to type. You could have just copied the subject of your own post...

https://www.google.com/search?q=%23define%2C+const+int+or+int+site:forum.arduino.cc

if I declare const int xyz=10; or int xyz=10;, this takes up memory space.

That leads me to think that when programming the Arduino, whenever I need a constant, it's better to use #define rather than int or const int. Am I correct? Do I understand this correctly?

No.

const int xyz=10; // declares a variable and initializes it (in RAM); you'll get a compilation error if your code tries to modify it;

No.

Find the many posts about this on this forum. We discussed it at great length. I added this post in case anyone who reads this thread later thinks the assertions above are true.

@Qdeathstar, after scouring the search results, feels there is not a consensus. So, I will provide one.

For simple constants, like the example provided by @rich1812, always use a typed constant. Always.

For simple constants, like the example provided by @rich1812, never use #define. Never.

This is correct...

const int xyz=10;

It is free. It is type-safe. It has no side effects.
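
To illustrate the type-safety and side-effect point, here is a contrived sketch (names made up) showing a classic #define pitfall that a typed constant simply cannot cause:

#define WINDOW_DEFINE 8 + 2          // looks like 10, but it is only raw text
const int windowConst = 8 + 2;       // evaluated once, to 10, with a real type

void setup() {
  Serial.begin(9600);
  Serial.println(WINDOW_DEFINE * 3); // prints 14: expands to 8 + 2 * 3
  Serial.println(windowConst * 3);   // prints 30, as intended
}

void loop() {}

On top of that, a const obeys normal C++ scoping and shows up with its type in compiler error messages, while a #define is blindly substituted everywhere the name appears.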

To summarize, for simple constants, always use a typed constant. Always.