const int vs. #define

I use

#define maxCount 50

a lot for constants to minimize the amount of RAM used. But in a lot of the sample code I see

const int maxCount=50;

for constants. Is there an advantage one way or the other?

Plenty of threads on this subject on the forum.

Compilers should be able to optimize the const int away, though. The biggest advantage of const int is that it obeys C scoping rules, whereas a #define ignores them. That's not a big deal in a small project, but it can cause name clashes in bigger ones.
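A minimal sketch of that scoping point (the names fillBuffer and buf are just illustrative): a #define is plain text substitution, so it rewrites any identifier with the same name anywhere in the file, while a const int is an ordinary object that local names can shadow or hide.

// With a macro, this would not compile: the parameter name would be
// replaced by the literal 50, i.e. "void fillBuffer(byte buf[], int 50)".
//#define maxCount 50

const int maxCount = 50;                      // obeys normal C/C++ scope rules

void fillBuffer(byte buf[], int maxCount) {   // the parameter simply shadows the global
  for (int i = 0; i < maxCount; i++) {
    buf[i] = 0;
  }
}

void setup() {
  byte buf[maxCount];
  fillBuffer(buf, maxCount);
}

void loop() {}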

#define is a compile-time (preprocessor) directive and consumes no memory by itself.
For a const int, the compiler may create an actual integer in the Arduino's RAM.

In modern usage on a PC or larger computer, #define is somewhat deprecated, since const int expressions can be type checked and the memory issue is negligible.

On a 328 chip you could in theory run out of memory, but that would take an improbably large number of 'const int' declarations.
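As a small illustration of the type-checking point above (windowSize and windowSafe are made-up names): a const int has a real type and is evaluated once, while a #define is expanded as raw text, which gives the classic operator-precedence surprise unless you remember the parentheses.

#define windowSize 10 + 5              // raw text substitution, no type, not evaluated
const int windowSafe = 10 + 5;         // typed constant, evaluated once by the compiler

void setup() {
  Serial.begin(9600);
  Serial.println(windowSize * 2);      // expands to 10 + 5 * 2 = 20, probably not intended
  Serial.println(windowSafe * 2);      // (10 + 5) * 2 = 30
}

void loop() {}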

Well, I'm down to 500 free bytes of RAM on my 328, so every byte is precious; I've taken to using bytes rather than ints where I can to save RAM. If "const int" uses RAM I'll stick with #define and make sure I don't goof up.
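For what it's worth, the two ideas combine: a const byte keeps the scoping and type checking of const while only promising an 8-bit value. A tiny sketch of that (assuming avr-gcc's usual optimization folds the constant away):

const byte maxCount = 50;     // 8-bit constant; the compiler normally folds it into the code
byte readings[maxCount];      // the array is the real RAM cost, not the constant itself

void setup() {}
void loop() {}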

const int is optimized to an immediate value, so there is no memory difference between that and #define.
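If you want to check that on your own board rather than take it on faith, a quick test sketch like the one below works: build it once with the #define and once with the const int and compare the "Global variables use ... bytes" line the IDE prints. With avr-gcc's default optimization the reported usage should be the same either way.

// Swap the two commented lines and compare the IDE's
// "Global variables use ... bytes" report after each build.
//#define maxCount 50
const int maxCount = 50;

int counter;

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (counter < maxCount) {
    counter++;
    Serial.println(counter);
  }
}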

cptdondo:
Is there an advantage one way or the other?

The second way has several advantages and no disadvantages. There was a time when programmers had to use #define because there was no other way, but those days have passed.