#include <LiquidCrystal.h>
LiquidCrystal lcd(12,11,5,4,3,2);

const int button(6);
int buttonState;
int counter;

void setup() {
  lcd.begin(16,2);
  pinMode(button, INPUT_PULLUP);
  lcd.print("The Counter is");
}

void loop() {
  buttonState = digitalRead(button);
  if (buttonState == LOW){
    counter = counter++;
    lcd.setCursor(0,1);
    lcd.print(counter);
    delay(20);
  }
}

What is the reason that the counter always stays at 0?

try

counter++;

instead.

The sequence is: make a copy of counter in memory, increment counter, then assign the copy back to counter.

Just use

counter++;

many thanks

The question was answered, but just to add some more information.

You should use

counter = counter + 1;

or

counter++;

The statement

counter = counter++;

is somewhat ambiguous, as counter is post-incremented. I think if you use the pre-increment form

counter = ++counter;

it should work as expected because counter is incremented before it is used (assigned in this case).

marco_c: The statement counter = counter++; is somewhat ambiguous, as counter is post-incremented. I think if you use the pre-increment form counter = ++counter; it should work as expected because counter is incremented before it is used (assigned in this case).

Nope. The result of ANY operation that modifies any variable on both sides of the expression is undefined, and can behave differently on different compilers. Read up on "C sequence points". For example:

c = c + 1; is fine
c++; is fine
++c; is fine
c = ++c; is undefined
c = c++; is undefined

If c is initially 1, the first three examples will ALWAYS assign 2 to c. But the last two can assign either 1 or 2 to c, depending on the compiler. The results of these expressions are, per the C language specification, undefined, and should NEVER be used. There are many other such examples.

Regards,
Ray L.

RayLivingston: The result of ANY operation that modifies any variable on both sides of the expression will be undefined, and can behave differently on different compilers. Read up on "c sequence points".

A compiler is allowed to do anything if you invoke undefined behavior, such as format your hard drive or send nasty emails to your employer. Of course, most compilers won't actually do that, but I would rather not risk it....

christop:
A compiler is allowed to do anything if you invoke undefined behavior, such as format your hard drive or send nasty emails to your employer. Of course, most compilers won’t actually do that, but I would rather not risk it…

Oh, please! Show me ANYTHING in ANY actual c language specification that so much as even hints at anything as ridiculous as that! Or, alternatively, find me even a single compiler ever written that does anything so idiotic.
The specs make it clear the numeric result of the assignment is undefined. Period, end of story.
Regards,
Ray L.

RayLivingston: Oh, please! Show me ANYTHING in ANY actual c language specification that so much as even hints at anything as ridiculous as that!

I'm sure it doesn't exist. But the point isn't about the existence. The point he was making was that you could write a compiler that did this and you wouldn't be breaking any rules. You probably wouldn't sell many compilers, but you wouldn't be breaking the rules of C.

The problem arises when folks who are writing these compilers say, "I don't need to test that, it's undefined" and then it does something totally unexpected.

I believe the classic statement is that it could make tiny demons fly out from your nose and it would still be compliant.

RayLivingston: Show me ANYTHING in ANY actual c language specification that so much as even hints at anything as ridiculous as that! ... The specs make it clear the numeric result of the assignment is undefined. Period, end of story.

It's not just the numeric result of the assignment that is undefined. The whole program is undefined. See this thread for an example of how one invocation of undefined behavior (accessing an array past its last element) can affect something seemingly unrelated (comparing two ints).

We haven't verified yet that this is the problem on that thread. If you'll notice, the OP said that the code he posted on that thread isn't even the real code that produced that output. So we are waiting to see what really happened.

I doubt it is the case. Writing past the end of an array would corrupt memory. Reading past the end of an array should just create a bad reading.

And actually, reading or writing arrays is defined behavior. If you write to the third element of a two-element array then it will write to the next memory location after the array. That may cause a strange error, but it will cause the same error every time as long as things are arranged in RAM the same way. It's different from the case on this thread in that two compilers could do completely different things.

Very rarely this array behavior gets exploited by crafty programmers to write one variable through access to another. It's a top secret way to mess with private member fields as long as you know how they are arranged in memory. Remember, all that array syntax is doing is pointer math, which is well defined.

Delta_G:
We haven’t verified yet that this is the problem on that thread. If you’ll notice, the OP said that the code he posted on that thread isn’t even the real code that produced that output. So we are waiting to see what really happened.

My understanding of that thread is that the code he posted is what produced the output in the first post. I might be wrong about that. But the program still invokes undefined behavior, and I wouldn’t be surprised to see that output because of it.

And actually, reading or writing arrays is defined behavior.

Reading from or writing to an array past its end is undefined behavior. It might do what you “expect” in most cases, but from the viewpoint of the standards and the compiler, it’s undefined behavior, and a compiler can do pretty much anything it wants to.

See Undefined behavior can result in time travel (among other things, but time travel is the funkiest) for a good analysis of a compiler optimizing away code in the face of this sort of undefined behavior.

Except that it is defined. If you write past the end of an array it will write to the next memory address after the array. The compiler isn't allowed to write wherever it wants. It is specified where that write will occur.

If you read past then it will read from the next memory address. It isn't allowed to read just anywhere, it is specified where it will read from.

Now that may not produce the results you expected. The results may surprise you. But it is a defined behavior.

I think you are confusing optimization with compilation. Two different optimizer methods may give two different results. But the compiler behavior there is well defined.

Delta_G, the C and C++ standards are very clear that accessing an array outside of its bounds is undefined behavior. I don’t know why you’re arguing otherwise.

From the C99 standard:

J.2 Undefined behavior

The behavior is undefined in the following circumstances:

– An array subscript is out of range, even if an object is apparently accessible with the given subscript (as in the lvalue expression a[1][7] given the declaration int a[4][5]) (6.5.6).

(C++ is substantially the same when it comes to undefined behavior.)

Since it is undefined behavior, the compiler most definitely is allowed to read or write wherever it wants (or do anything else that it wants, including but not limited to making demons fly out of your nose).

If you don’t believe me, look at Winner #1 at Undefined Behavior Consequences Contest Winners.

OK. Conceded. I was under the impression that these two would produce the same code after compilation. I must be wrong.

byte array[5] = {1,2,3,4,5};

array[3] = 6;

///vs.

*(array + 3) = 6;

I thought that was what the number in the brackets was supposed to do.

Delta_G: I was under the impression that these two would produce the same code after compilation. I must be wrong.

They are the same.

Section J.2 also lists these undefined behaviors:

— Addition or subtraction of a pointer into, or just beyond, an array object and an integer type produces a result that does not point into, or just beyond, the same array object (6.5.6).

— Addition or subtraction of a pointer into, or just beyond, an array object and an integer type produces a result that points just beyond the array object and is used as the operand of a unary * operator that is evaluated (6.5.6).

The first bullet means that the following code is illegal, and the compiler can do absolutely anything with it:

int array[5];

int *p = array + 6;

Yes, even the simple act of calculating a pointer to beyond the array object (except for "just beyond" the array) is undefined behavior.

In the context of the C99 standard, "just beyond" means one element past the end of the array (such as "array + 5" with an array of 5 elements). It's legal to calculate a pointer to that element, but it's not legal to dereference that pointer (see the second bullet).

Most of the time reading from beyond an array will just give you data from beyond the array (such as a function's return address or some other variable), but in some situations a compiler will outsmart you and completely change the meaning of the program, giving you results that you didn't expect (such as turning a finite loop into an infinite loop). That's why undefined behavior is so difficult to detect: a program can appear to "work" one time and then suddenly stop working when you change other code, use a different compiler, update the compiler, use different compiler options, etc.

Slightly OT: I once did something like this:

struct node {
    struct node *next;
    // other data members omitted for clarity
};

struct node *x = 0;
struct node *y = (struct node *)&x;
y->next = malloc(sizeof *y->next);

It's dodgy, but it seemed to work because "y->next" should be the same as "*x". Then I had to port the software to a different platform (from ColdFire to Arm) and a newer version of GCC. My code broke because it depended on undefined behavior, and either the platform change or the compiler change exposed it.

Side note: In Arduino 1.0.x you would get an increment if you used "count = count++;" but in 1.6.x you get no action.

Also: "count += 1;" is another way to properly say "count = count + 1;"

RayLivingston: Oh, please! Show me ANYTHING in ANY actual c language specification that so much as even hints at anything as ridiculous as that!

I'm sure he was exaggerating the point for comedic and pedagogical effect. The lesson is: don't do it.

Delta_G, the C and C++ standards are very clear that accessing an array outside of its bounds is undefined behavior. I don't know why you're arguing otherwise.

That is, you can't assume that going out of bounds will access the 'next' item in memory. An example is segmented memory.

Back in the good old days of 16-bit x86, memory addresses had a two-part format: a segment (which was multiplied by 16) and an offset. If you have an array of 16 bytes which, as it happened, got placed at memory address 1234:FFF0, attempting to access foo[16] will make the computed offset roll over to zero.

The point being, it might do this, it might not. You don't know and you mustn't rely on it.

Having said that, people did rely on it and use it all the time. You would define a struct like this:

#include <stdlib.h>
#include <string.h>

struct foo {
  int len;
  char varying[0];   /* zero-length array: the old "struct hack" */
};

struct foo *newFoo(int len) {
  struct foo *f = malloc(sizeof(struct foo) + len + 1);
  f->len = len;
  return f;
}

void someCode(char *s) {
  struct foo *f = newFoo(strlen(s));
  strcpy(f->varying, s);
}

And it would be safe to use f->varying. Why? Because although the language made it undefined, the defined behaviour of malloc would make it safe.

This is an example of why C lets the programmer do unsafe things. We are expected to be just a tiny bit smarter than the compiler. We know things about runtime behaviour that a compiler cannot know.

The people that work on gcc seem to enjoy changing the behavior of things that have "undefined" behavior, from one release to another. Maybe they have good excuses, somewhere off in the optimization of intermediate-representation pseudo-code. Or maybe it's how they get their jollies :-(

(Likewise, the people who work on the language standards every once in a while seem to do "let's get serious and DEFINE this formerly undefined thing. In a way that is incompatible with most existing implementations, if we can!")

I've been bitten a number of times... They're all out to get me, I tell you!