I've noticed an almost religious hatred of delay() amongst the experts in this forum. But the discussions in recent threads are a great example of why millis() timing isn't always the right tool for the job. The truth is, it's more complicated than delay() and, as is perfectly evident, harder to understand and easier to get wrong, especially for beginners.
So I will make this assertion: it is ALWAYS better to use delay() UNLESS you need to do something else during that delay period*. Only then should you go the millis()-based multi-tasking route.
From an engineering point of view, there is no merit in making something more complicated than it needs to be. Indeed, it is an engineering sin to do so.
Just to be clear: I use millis()- and micros()-based multitasking in most of my projects, but where the application is simple enough to allow it, I will always use delay().
Good engineering requires you use the right tool for the job, not the same tool for every job.
*There is another special case for avoiding delay() where timing offsets must not be allowed to accumulate, as discussed in another thread, but that's rare and beyond the scope of this thread.
My view is 'delays are evil'
However... a lot of first-time users find them much easier to understand.
The sketch with delays is linear: do this, then this, then this, then repeat.
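For example, the classic linear blink, using nothing beyond delay():

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);   // do this
  delay(1000);                       // then wait
  digitalWrite(LED_BUILTIN, LOW);    // then this
  delay(1000);                       // then wait, then repeat
}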
But once the sketch gets more complicated and has 'states' and multiple interacting delays, then it needs to be re-written using millis() (or some delay class, like my millisDelay).
In that case I find re-writing the sketch as a series of tasks helps clarify how it is actually running when using millis() for delays.
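As a rough sketch of what I mean (the task name blinkTask is just illustrative), the blink above re-written as a millis()-based task might look like this:

const unsigned long BLINK_INTERVAL_MS = 1000;
unsigned long lastBlinkMs = 0;
bool ledOn = false;

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void blinkTask() {
  if (millis() - lastBlinkMs >= BLINK_INTERVAL_MS) {
    lastBlinkMs = millis();
    ledOn = !ledOn;
    digitalWrite(LED_BUILTIN, ledOn ? HIGH : LOW);
  }
}

void loop() {
  blinkTask();
  // other tasks can be called here; nothing is blocked
}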
As for special cases, I have noticed a number of posters come unstuck with Serial and delays when the delay causes the Serial RX buffer to overflow and lose input, and they don't know what is happening, particularly since it works for 'small' inputs and then suddenly fails as the input gets longer (more than 64 bytes).
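A stripped-down illustration of that failure mode, assuming the usual 64-byte Serial RX buffer: send a short line and it echoes back fine; send more than 64 characters while loop() is parked in delay() and the tail is silently lost.

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    while (Serial.available() > 0) {
      Serial.write(Serial.read());   // echo whatever survived in the buffer
    }
    Serial.println();
  }
  delay(5000);   // bytes arriving during this delay, beyond the 64-byte buffer, are dropped
}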
Perhaps the Arduino dev team should put an artificial ceiling on delay() functionality that only allows delay() to be used for less than, say, 500 ms, which might prompt some users to look for other possibilities.
e.g. repetitive 100 ms delays while they check for other events…
This is an obvious gateway into using millis().
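Something along those lines, for example (pin 2 for the button is just an assumption): a one-second wait broken into 100 ms slices, with an event check between slices.

const int buttonPin = 2;   // wired to GND, using the internal pull-up

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  for (int i = 0; i < 10; i++) {       // ten 100 ms slices instead of one delay(1000)
    delay(100);
    if (digitalRead(buttonPin) == LOW) {
      digitalWrite(LED_BUILTIN, HIGH); // the press is noticed within ~100 ms, not ~1 s
    }
  }
  digitalWrite(LED_BUILTIN, LOW);      // the once-a-second work happens here
}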
Harsh, but sometimes you need to be cruel to be kind!
How interesting - I agree with every word of that, and I don't think what you've written demonstrates a problem with delay(). Rather, you've confirmed that delay() is fine for simple, linear sketches, and millis() is the way to go for more complex applications.
When you write "My view is 'delays are evil'", that sounds too dogmatic and religious, neither of which has any place in engineering. Engineering decisions are supposed to be logical and rational; there is no evil, there is simply suitable or unsuitable.
Many of my projects are fairly simple and I regularly use delay() when that's all that is required. I just don't talk about it in here. I don't want to get involved in religious discussions.
If you don't learn how to use millis() timing and in particular if you don't understand the difference between blocking and non-blocking code then sooner or later it will bite you. But for many of the people we meet in here who are often using some "found" code and who barely understand what 'if' and 'for' do there's plenty to learn without diving into the confusion of millis() timing. So later is often the right time to learn.
Different people react differently, but I think that there is something to be said for learning by making mistakes. If you have created a problem by using delay() then the experience of changing the code to use millis() can, I believe, be very instructive, and you are likely to remember the pain and not want to suffer it again.
Lots of code blocks, for loops included, unless steps are taken to avoid it. You are right: the blocking nature of delay() is a fact, but for many simple applications it simply doesn't matter. So when people make sweeping statements like "delay() is bad" or "I hate delay()", they are not speaking from the discipline that engineering gives us, but from dogmatism. Put simply, delay() is not always bad. It has its place.
I absolutely agree with every word of this. Understanding the difference between blocking and non-blocking code is essential. But that does not mean never using blocking code. Blocking code is just fine when you don't need the processor to do anything else during that interval.
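A trivial case in point, assuming a sensor on A0 that needs a couple of seconds to settle after power-up and a processor with nothing else to do in the meantime:

void setup() {
  Serial.begin(9600);
  delay(2000);                      // sensor warm-up; blocking here costs nothing
  Serial.println(analogRead(A0));   // first reading once the sensor has settled
}

void loop() {
  // nothing else is time-critical, so the blocking delay above was harmless
}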
No, bad analogy; amputation is always a last resort. Try to find an engineering metaphor so we can relate it to blocking vs. non-blocking.
How about this... Civil engineers use concrete and steel to build bridges. I don't think there has ever been a bridge built from plastic. But they don't say "plastic is evil" and insist that toothbrushes should be made from concrete. Plastic has its place, concrete has its place. Neither is evil.
That's why I brought it up. Religion has no place in engineering, so when you read things like "delay() is bad" it sounds awfully like the kind of faith-based, dogmatic thinking that religion brings.
I'm just arguing to keep religion (or perhaps "dogmatism" would be a better word) out of the advice we give to newbies. Right now we aren't doing that.
I totally agree. Importantly, it teaches you why, and under what circumstances, delay() is the wrong tool for the job. Just telling newbies to always use millis() doesn't do that.
The only problem that arises is that 'programmers who love programming' cannot leave good enough alone, and soon the simple, linear sketch becomes a multi-delay mess.
I would hire a programmer that hates programming over one that loves it any day.
You give a task like 'code a method that adds two integers' to a programmer that loves programming, and at the end of the week you get back a multi-headed library that handles short, int, long, long long, float, double and complex numbers. No test cases, no documentation.
That example is a little extreme, but I actually had a case like that. More than the spec asked for, but no test cases and no documentation.