c code "char" explanation / clarification please

So, I am trying to learn C (know nothing about C or any other language … last programming I did was back in the 80’s using BASIC).

Just a few lessons (Youtube) in.

They are just starting to talk about defining variables.

In the example (which worked) …

char characterName[] = "John";
int characterAge = 35;

OK, I read about data types … char and int make sense.

I just don’t get the two square brackets. I tried to read … got confused … they started talking about arrays and null characters.

So why the closed square brackets?

Is that “just the way it is done” or ???

Just want to make sure I understand the "why" before I go on to the next lesson.

Thanks!

PS … here was the entire example program …

#include <stdio.h>
#include <stdlib.h>

int main()
{
    char characterName[] = "John";
    int characterAge = 35;

    printf("There once was a man named %s\n", characterName);
    printf("he was %d years old.\n", characterAge);
    printf("He really liked the name %s\n", characterName);
    printf("but did not like being %d.\n", characterAge);
return 0;
}

Read the documentation on the 'array' data type in the reference section of this site.

char characterName[] = "John";

John is 5 characters: the 4 obvious ones plus an invisible one at the end called a null terminator, basically a zero, which marks where the string ends. Using [] like that tells the compiler to reserve as many bytes as necessary (5 in this case) to hold the text.
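To make that concrete, here's a quick compilable sketch (the sameThing name is just made up for illustration); the two declarations mean exactly the same thing:

#include <stdio.h>

int main()
{
    char characterName[] = "John";               // compiler counts the bytes: 5
    char sameThing[5] = {'J','o','h','n','\0'};  // what the [] shorthand spells out

    printf("%zu\n", sizeof(characterName));      // prints 5: 4 letters + terminator
    printf("%s\n", sameThing);                   // prints John, stops at the '\0'
    return 0;
}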

int aNum; Means one number called aNum. You got that already.
int nums[5]; Means five ints in a row. The set is called nums; each one can be accessed by an index, or offset from the start.

nums[0] zero offset
nums[1] first offset
nums[2] second offset, etc.

nums[3] = aNum; Read what's in aNum and store it into the 3rd offset of nums.

It's a way of storing a list of things that are all the same type.
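Putting that together, a tiny runnable example (values made up):

#include <stdio.h>

int main()
{
    int aNum = 42;
    int nums[5] = {10, 20, 30, 40, 50};

    nums[3] = aNum;              // store aNum at the 3rd offset

    printf("%d\n", nums[0]);     // 10 -- zero offset
    printf("%d\n", nums[3]);     // 42 -- just overwritten
    return 0;
}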

Strings are lists of chars hence..

char myStr[] = "A string of chars"; This is an automatic way of doing..

char myStr[18]; a list of 18 chars: the 17 visible ones + 1 for the null terminator.
strcpy(myStr,"A string of chars"); a function that will copy a string into a list of chars.
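As a complete program, that would look something like this (note that strcpy() lives in string.h):

#include <stdio.h>
#include <string.h>

int main()
{
    char myStr[18];                      // 17 visible chars + 1 null terminator
    strcpy(myStr, "A string of chars");  // copy the literal into the array
    printf("%s\n", myStr);
    return 0;
}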

Hope this helps.

-jim lee

jimLee:
Strings are lists of chars hence…

…only for the very loosest of definitions of “list”

You guys are awesome !!!!!!!!!!!

Exactly the information I was looking for !!!!

I just hate to move on without completely understanding how something works.

Thanks so much again !

Mike

TheMemberFormerlyKnownAsAWOL:
..only for the very loosest of definitions of "list"

Hey! I'm talkin' human, not code. Don't start confusing the issue.

-jim lee

Hey! It's all Greek to me :astonished:

Maybe you can teach an old dog new tricks ... let's hope so.

Thanks again!

Mike

Xtal - please try not to use the String class. An awful lot of example sketches do, but you will be years ahead in programming skills if you use char arrays instead.

No:
String xyz="xyzzy";

Yes:
char xyz[] = "xyzzy";

Both will work in your sketch, but Strings will eventually crash your Arduino due to the inefficient way the String class uses memory.
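For example, the char array way of building a message looks something like this in plain C (the 32-byte buffer size is just picked for illustration):

#include <stdio.h>

int main()
{
    char xyz[] = "xyzzy";                               // plain char array, no String class
    char msg[32];                                       // fixed buffer, sized up front

    snprintf(msg, sizeof(msg), "magic word: %s", xyz);  // bounded copy, no heap churn
    printf("%s\n", msg);
    return 0;
}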

Thanks for the heads up!

I am making notes of all these little items.

I really want to learn the "right" way to program.

I already ran into a problem like this. I found three examples of a program to blink an LED. All three defined the pins using different data types.

They all worked, but there had to be a reason to choose one over another. One used #define ... one used int ... and then I was told I should use const byte (hope this info is correct)?

Thanks !!!!!!!!!!

Mike

Oh, well while we're sitting around the fire spinning code yarns..

Beware of delay().

delay() is the deceiver! It promises to help but sucks the lifeblood from your processor.

-jim lee

xtal_01:
They all worked, but there had to be a reason to choose one over another. One used #define ... one used int ... and then I was told I should use const byte (hope this info is correct)?

iirc, #define is a preprocessing macro. It basically tells the compiler, "Hey, whenever you see this thing, replace it with this instead."

const int / const byte is exactly that, a constant: a variable that never changes. About int and byte: the int datatype is 16 bits long on AVR-based Arduinos (its size varies by platform), while byte is 8 bits long, so if you want to save space, using byte for small numbers usually works.

As for the reasons, it's mostly preference, tbh. There are some small quirks to each method; for example, you can actually redefine and undefine your definitions whenever you want, but you cannot reassign a value to a const after it's declared. Another one I'm aware of: const int / const byte theoretically consumes RAM, because it is a variable, but most modern compilers are smart enough to leave const variables out of RAM (not sure about Arduino's compiler).

For most purposes though, it's preference.
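To illustrate the three styles side by side (pin number and names made up; on Arduino, byte is an alias for uint8_t):

#include <stdio.h>
#include <stdint.h>

#define LED_PIN 13                 // preprocessor: textual find-and-replace, no type

const int     ledPinInt  = 13;     // typed constant, platform-sized int
const uint8_t ledPinByte = 13;     // 8-bit typed constant (Arduino's "byte")

int main()
{
    printf("%d %d %d\n", LED_PIN, ledPinInt, ledPinByte);  // all three usable as values
    return 0;
}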

linearity64:
About int and byte: the int datatype is 16 bits long on AVR-based Arduinos (its size varies by platform), while byte is 8 bits long, so if you want to save space, using byte for small numbers usually works.

Awesome!

I am sure knowing all these little quirks is what separates someone who can write code from someone who writes good code, and from someone who writes great code.

I want to try learning the correct way to code ... saves having to break bad habits later :slight_smile:

Thanks again for all the info !!!!!!!!!!

Mike

A variable declared as 'const' is type safe because it also has a type. A #define has no type and so the compiler can't apply checks to see if it's being used correctly. There are "replacement traps" (unintended syntax breakage) that plague #define's that can't happen with a const variable. So it's more than preference, IMHO.
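A classic replacement trap, as an example:

#include <stdio.h>

#define WIDTH 2 + 3            // looks like 5, but it's only text
const int width = 2 + 3;       // really is 5, evaluated once

int main()
{
    printf("%d\n", WIDTH * 4); // expands to 2 + 3 * 4 = 14, not 20
    printf("%d\n", width * 4); // 20, as intended
    return 0;
}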

Agreed. I found out the hard way that #define doesn't respect scope. Seems obvious now but it wasn't obvious when I was first learning C.
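A little made-up example shows it:

#include <stdio.h>

void setupLeds(void)
{
    #define LED_COUNT 8            // looks like it's local to this function...
}

int main()
{
    printf("%d\n", LED_COUNT);     // ...but it expands here too: macros ignore braces,
    return 0;                      // and stay defined to the end of the file
}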

aarg:
A variable declared as 'const' is type safe because it also has a type. A #define has no type and so the compiler can't apply checks to see if it's being used correctly. There are "replacement traps" (unintended syntax breakage) that plague #define's that can't happen with a const variable. So it's more than preference, IMHO.

+1. Learnt this the hard way (semicolons after each line of #define, lol). I think a lot of the confusion with #define (at least in my experience) comes from people thinking of it as a way to declare a variable, when it's actually just a "find and replace" macro.
Also, talking about safety, I found out that you can actually write function-like macros with #define. Which is neat, but looks like the epitome of error-prone to me, lmao.
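For example, a made-up function-like macro shows why:

#include <stdio.h>

#define SQUARE(x) x * x            // a "function" by pure text substitution

int main()
{
    printf("%d\n", SQUARE(3));     // 3 * 3 = 9, fine
    printf("%d\n", SQUARE(1 + 2)); // expands to 1 + 2 * 1 + 2 = 5, not 9
    return 0;
}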