Discussion of Delay

So...

why not replace delay() with something involving millis(), to reduce the "why isn't my interrupt executing" questions, etc.?

baum

There is a fundamental difference between code designed with delay() and code designed with millis().

millis() isn't a blocking function. It is a ticker. A state machine needs to be implemented so that actions occur based on the number of ticks.

However, in the context of your original question, delay() doesn't interfere with interrupts.
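
To illustrate the state-machine idea above, a minimal non-blocking blink might look like this (the pin number and 500 ms interval are arbitrary example values):

// Toggle an LED every 500 ms without blocking loop().
const byte ledPin = 13;
const unsigned long interval = 500;        // ms between state changes

unsigned long previousMillis = 0;
bool ledState = LOW;

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  unsigned long now = millis();
  if (now - previousMillis >= interval) {  // enough ticks elapsed?
    previousMillis = now;
    ledState = !ledState;                  // advance the two-state machine
    digitalWrite(ledPin, ledState);
  }
  // loop() keeps spinning, so switches, serial, etc. stay serviced
}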

Oh :( Then why is it bad? (This was a continuation of "replacing goto".)

Then why is it bad?

It's neither good nor bad. It's a tool, and like any other tool it needs to be used properly.

It's when someone is using delay(3600000UL) and wonders why the Arduino doesn't respond to switch presses that it's bad.
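
For contrast, the millis() version of that hour-long wait keeps the switch responsive; a rough sketch (the pin number and the actions are made up for the example):

const byte switchPin = 2;
const unsigned long oneHour = 3600000UL;
unsigned long lastRun = 0;

void setup() {
  pinMode(switchPin, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  if (digitalRead(switchPin) == LOW) {   // the switch still gets noticed
    Serial.println("switch pressed");
  }
  if (millis() - lastRun >= oneHour) {   // the hourly job, non-blocking
    lastRun = millis();
    Serial.println("hourly task");
  }
}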

baum:
Then why is it bad? (this was a continuation of "replacing goto")

"Good" and "bad" will always carry the qualifier "depends". The use of delay() is discouraged because it usually becomes very limiting as a project moves forward. As I already said, the implementation of a delay()-based sketch and a millis()-based sketch if fundamentally different.

Same story with goto. It might seemingly solve a simple problem, but as a program grows it becomes a limiting factor.

baum:
Then why is it bad? (this was a continuation of "replacing goto")

It's not bad per se.

If you find a switch has been pressed, and want to briefly wait (e.g. 10 ms) to debounce, then using delay(10) is simpler than writing somewhat more complex code to remember the switch was pressed and detect that some time later in the main loop.
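
For example, a simple delay()-based debounce along those lines (the pin number and the 10 ms settle time are assumptions):

const byte buttonPin = 2;

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);          // button wired to GND, pressed == LOW
  Serial.begin(9600);
}

void loop() {
  if (digitalRead(buttonPin) == LOW) {       // looks pressed...
    delay(10);                               // ...wait out the contact bounce
    if (digitalRead(buttonPin) == LOW) {     // still pressed: accept it
      Serial.println("pressed");
      while (digitalRead(buttonPin) == LOW) ;  // wait for release
    }
  }
}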

It's been well said above. I concur.

--
And now I am inspired: The next post where the confused user doesn't fathom why the delay() is ruining his logic, the answer could be

Replace every "delay(x)" with "for (unsigned long tick=0; tick<16000UL*x; tick++) /*wait here*/ ;" Now you may understand why your program will hang/pause/do-nothing.

(yes, the compiler may optimize the loop away, but it is a mental exercise, to emphasize that delay() will eat up CPU cycles and not do anything else)

Or another substitution: "start = millis(); while (millis() - start < x) /*wait here*/ ;"
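
Written out as an actual function, that second substitution is essentially a hand-rolled delay(); the name busyDelay() is made up for this example:

// A stand-in for delay() that makes the busy-waiting explicit.
void busyDelay(unsigned long x) {
  unsigned long start = millis();
  while (millis() - start < x) {
    // wait here: the CPU does nothing useful for x milliseconds
  }
}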

for (unsigned long tick=0; tick<16000UL*x; tick++) /*wait here*/ ;

Is one tick of a for loop equal to one clock cycle? In assembly, wouldn't it take three or so? (One to check the variable, one to increment, one to jump.)

baum

One could limit delay() to a less absurd interval of, say, 1000 milliseconds.
The problem is that this is not the correct way of writing software; people should know what each function, macro, class, etc. does and then decide how to use it. This is probably one of the weakest points of Arduino, since it does not make everything as easy as people expect it to be. But it could be worse.

Same with goto: if used right, it can be a pretty useful command even in huge programs. Linus Torvalds uses it extensively in the Linux kernel. An Interview With Linus Torvalds: Linux and Git - Part 1 | Tag1 Consulting
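
The usual "used right" case is error handling with a single cleanup path, roughly like this C-style sketch (the function and resource names are invented for illustration):

#include <stdio.h>
#include <stdlib.h>

// All error paths jump to one place that releases resources in reverse order.
int processFile(const char *path) {
  int result = -1;
  char *buffer = NULL;
  FILE *f = fopen(path, "rb");
  if (f == NULL)
    goto out;                 // nothing to clean up yet

  buffer = (char *)malloc(1024);
  if (buffer == NULL)
    goto out_close;           // the file is open, so it must be closed

  if (fread(buffer, 1, 1024, f) == 0)
    goto out_free;            // both resources need releasing

  result = 0;                 // success

out_free:
  free(buffer);
out_close:
  fclose(f);
out:
  return result;
}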

baum:

for (unsigned long tick=0; tick<16000UL*x; tick++) /*wait here*/ ;

Is one tick of a for loop equal to one clock cycle? In assembly, wouldn't it take three or so? (One to check the variable, one to increment, one to jump.)

That is why it is called "tick" and not "clockCycles". A tick is arbitrary.

Actually, delay() in Arduino is tied to the same underlying mechanism as millis(): it uses the timer. A call to delay() translates into waiting until the timer-based counter reaches the calculated value for the delay.
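
For reference, the AVR core's delay() is roughly equivalent to the following paraphrase (from memory; the real code also calls yield() and differs between core versions):

// Poll the timer-backed micros() counter until enough time has passed.
void delayLikeArduino(unsigned long ms) {
  unsigned long start = micros();
  while (ms > 0) {
    while (ms > 0 && (micros() - start) >= 1000) {
      ms--;
      start += 1000;
    }
  }
}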

There is a _delay_ms() AVR libc function which busy-waits for a calculated number of cycles so that a certain amount of time passes.

Either is a valid method, but the resulting code bloats in size if you try to use "variable" values with _delay_ms(), since its argument is meant to be a compile-time constant.

The Arduino solution is much better for Arduino users and is actually quite nice.

If you do not want interrupts to affect your delay()s, then use _delay_ms(); just make sure to use constant values, or loop over _delay_ms(1) the required number of times if you need variable delays.
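
A variable-time wrapper along those lines might look like this (the wrapper name is made up; _delay_ms() comes from <util/delay.h> and wants a compile-time constant argument):

#include <util/delay.h>   // F_CPU is defined by the Arduino build system

// Busy-wait a run-time number of milliseconds by looping over the
// compile-time constant _delay_ms(1).
void variableDelayMs(unsigned int ms) {
  while (ms--) {
    _delay_ms(1);          // constant argument keeps the generated code small
  }
}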