I'm an adult educator by default and later by post-grad study, so I'm always pondering whether stuff is worth including in training and, if so, to what depth and how soon. One of the first programming courses I went on adopted a silo approach and taught each command in depth, although a working program needs width (ie knowledge of many commands) before depth (a knowledge of each command's esoteric parameters). Since then, it's been interesting for me to see how curriculum requirements are a) defined and then b) sequenced into the material.
This thread raises the issue of whether delay() should ever (never?) be taught. If it is worth learning, should it come before the Blink Without Delay (BWD) millis() approach is learned, or what?
I suppose one school of thought is never to use delay(), although a 5ms delay to let an EEPROM write finish is probably a legitimate use. If your program does literally nothing but light an LED for 10 seconds every hour, is delay() legitimate then, or is it just being lazy? It would certainly be difficult to undo that if the program requirements ever changed.
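For what it's worth, a minimal delay()-based sketch of that once-an-hour program might look something like the following. It's just an illustration of the point above, not anyone's recommended code; the pin number and exact timings are assumptions.

// Illustrative only: light an LED for 10 seconds once an hour, using delay().
const int ledPin = 13;            // assumed pin; the built-in LED on many boards

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  digitalWrite(ledPin, HIGH);     // LED on
  delay(10000UL);                 // block for 10 seconds
  digitalWrite(ledPin, LOW);      // LED off
  delay(3600000UL - 10000UL);     // block for the rest of the hour
  // While either delay() runs, loop() can do nothing else, which is why
  // any later change of requirements means restructuring the whole sketch.
}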
If delay() is learned first, is there a risk of the "with delay()" mindset becoming so entrenched that, when it is definitely inappropriate to use it, the BWD / state machine approach is harder to learn?
Edit... a bit of background. The delay() function blocks your sketch while the delay is, well, delaying: none of your own code runs in the meantime (interrupts aside). So if you delay for 10 seconds after switching an LED on, in order to switch it off again, your code can do nothing else in between. Using millis(), however, the program would switch the LED on and carry on doing stuff. Among the stuff it does is look, every now and then, to see whether 10 seconds have expired. If not, carry on with other stuff. If they have, turn the LED off, then carry on with other stuff.
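To make that concrete, here's a minimal sketch of the millis() idea as described above. It's not anyone's canonical BWD code; pin 13 is an assumption, and the 10 seconds comes from the example in the paragraph.

// Minimal sketch: switch the LED on, keep doing other stuff,
// and switch it off once 10 seconds have passed - without delay().
const int ledPin = 13;                     // assumed pin
const unsigned long onTime = 10000UL;      // 10 seconds
unsigned long ledOnAt = 0;                 // when the LED was switched on
bool ledIsOn = false;

void setup() {
  pinMode(ledPin, OUTPUT);
  digitalWrite(ledPin, HIGH);              // switch the LED on...
  ledOnAt = millis();                      // ...and note when
  ledIsOn = true;
}

void loop() {
  // "Other stuff" goes here: read sensors, poll buttons, handle serial...

  // Every pass through loop(), check whether 10 seconds have expired.
  if (ledIsOn && (millis() - ledOnAt >= onTime)) {
    digitalWrite(ledPin, LOW);             // time's up: switch it off
    ledIsOn = false;
  }

  // ...then carry on with other stuff.
}

The key design point is that loop() never waits: it just checks millis() on each pass, so everything else keeps running.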