The merits of delay() vs millis()

IMHO.
I have been there AND here!
When I started programming I always preferred simplicity. The only time I went to the next level was when I was pushed into a corner.
I was ALWAYS using delay(), but have since moved on to using millis() timers.
Using timers seems scary at first, but it absolutely is not. Just learn a few basics and you are good to go.
Rule of thumb:

  1. If your code is going to remain as-is, with a single task and no background work, feel free to use delay().
  2. If you want to write code that is dynamic, expandable in the future, OR has more than one thing running at once, save yourself trouble down the line: get out of the sandbox and learn something new. (A minimal sketch of each case follows below.)
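
To make the two cases concrete, here are minimal sketches of each (pin 13 and 500 ms are just example values). Point 1 is the classic blocking blink, where delay() is perfectly fine because nothing else needs the CPU:

```cpp
// Point 1: a single task and no background work -- delay() is fine.
const int LED_PIN = 13;   // example pin; adjust for your board

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_PIN, HIGH);
  delay(500);              // blocks, but there is nothing else to do
  digitalWrite(LED_PIN, LOW);
  delay(500);
}
```

Point 2 is the same blink rewritten with a millis() timer, so loop() keeps spinning and more tasks can be dropped in later without touching the timing:

```cpp
// Point 2: non-blocking version -- loop() stays free for other tasks.
const int LED_PIN = 13;                  // example pin
const unsigned long INTERVAL_MS = 500;   // example interval

unsigned long previousMillis = 0;
bool ledState = false;

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  unsigned long now = millis();
  if (now - previousMillis >= INTERVAL_MS) {   // unsigned subtraction is rollover-safe
    previousMillis = now;
    ledState = !ledState;
    digitalWrite(LED_PIN, ledState);
  }
  // other non-blocking tasks can go here
}
```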

Well, you only have to read this forum! It was a post from a newbie who'd got his/her knickers in a twist trying to use a millis() timer without understanding what they were trying to do, or why, because one of our "experts" had declared "delay() is evil".

That level of religious-like dogmatism is what made me start this thread, because I am certain it has no place in the world of engineering. There is NEVER room for dogmatism in engineering. There is never anything "evil" in engineering. There is only "appropriate" or "inappropriate", "suitable" or "unsuitable".

Yes! We agree! delay() DOES have a valid role, just as you describe in point 1. millis() timing also has a valid role, as in point 2. Horses for courses. Good engineering throughout.

Yes, but that's part of the learning process; once you know how to create a millis() based timer, they really are simple. Not risking making a mess of a millis() based timer means never learning how to use one. We are here to help when people make a mess of it. I don't mind helping people who are trying and getting it wrong; the ones I don't like are the ones not interested in trying.

Current books are "Far Inside The Arduino" and "Far Inside The Arduino: Nano Every Supplement" available at Amazon, printed or as Kindle books. I also have a blog that discusses progress on the next book at https://tomalmy.com.

I've never suggested "not risking making a mess of a millis() based timer".

I've suggested repeatedly that delay() is the correct solution when there is nothing else that needs doing, and millis() when there is (or will be in due course).

I think people should be competent with both, and use whichever is the most appropriate for the job.

Amen.
Now, explain that to the next noob who wants to know all about interrupts.

Thanks, Tom - both books purchased. :grinning:

Nah, don't be so hard on yourself.

a7

Or, or... you just come up with a bag of tricks you use and trust, one that takes you BEYOND all this mess. (Yes, debugged and reliable.)

Programming should NOT be learning code, but teaching the computer YOUR language. This is what libraries are all about. Don't like using delay()? No problem. Make NOT using it a no-brainer with your bag of tricks.
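
For instance (a hedged sketch of the idea, not jim lee's actual library; the EveryMs name and the pins and intervals are invented for illustration), one such "trick" is a tiny helper class that hides the millis() bookkeeping, so not using delay() really is a no-brainer:

```cpp
// Hypothetical helper: ready() returns true once every "interval" milliseconds.
class EveryMs {
  unsigned long last = 0;
  unsigned long interval;
public:
  explicit EveryMs(unsigned long ms) : interval(ms) {}
  bool ready() {
    unsigned long now = millis();
    if (now - last >= interval) {   // unsigned math keeps this rollover-safe
      last = now;
      return true;
    }
    return false;
  }
};

EveryMs blinkTimer(500);     // example intervals
EveryMs printTimer(1000);
bool ledState = false;

void setup() {
  pinMode(13, OUTPUT);       // example pin
  Serial.begin(9600);
}

void loop() {
  if (blinkTimer.ready()) {
    ledState = !ledState;
    digitalWrite(13, ledState);
  }
  if (printTimer.ready()) {
    Serial.println("still running");
  }
}
```

Once a helper like that sits in your personal library, adding a third or fourth timed task is one more object and one more if, with no delay() anywhere.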

-jim lee

don't delay(); use millis(); today

I never need to waste cycles. Whatever I write as a task is compatible with all non-blocking code.

When I was little, I crawled. Since learning to walk, I've only crawled when necessary.

Programming should NOT be learning code, but teaching the computer YOUR language.

I write C in the Arduino IDE. That gets translated to machine code that I don't write.

A programming language is the programmer's interface to the machine, a way to write the instructions it follows to do the steps it is able to do.

I write in my own style using collected working techniques including un-delay.

If I ever make a drone, it should be a gyroplane.

And what are we drinking tonight?

The compiler translates my source to hex. It's not MY LANGUAGE but the STYLE I use: non-blocking, event-driven tasking code, done more simply than explained. :-D

Of course you do! You're saying every program you write requires 100% CPU utilisation? I don't think so.

Every real-world application involves the CPU spinning its wheels somewhere, waiting for something to happen, whether it's a Windows/Linux desktop, a device running an RTOS, or an Arduino flashing an LED. It makes not one jot of difference to the CPU whether it's whizzing round inside delay() or whizzing round inside loop(). In both cases it's killing time until some condition is met.

If the only condition you care about is the expiration of a certain number of milliseconds, use delay(). If there are multiple conditions, especially if they are unpredictable, use millis(). Neither is "better", neither is "evil".
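
A sketch of the "multiple, unpredictable conditions" case (pin numbers are examples; the button is assumed wired to ground and read with INPUT_PULLUP): the LED blinks on a millis() timer while the button is checked on every pass of loop(), so a press is noticed immediately instead of waiting out a delay().

```cpp
const int LED_PIN = 13;              // example pin
const int BUTTON_PIN = 2;            // example pin, button to GND, INPUT_PULLUP
const unsigned long BLINK_MS = 500;  // example interval

unsigned long lastBlink = 0;
bool ledState = false;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  // Condition 1: predictable -- is it time for the next blink?
  if (millis() - lastBlink >= BLINK_MS) {
    lastBlink = millis();
    ledState = !ledState;
    digitalWrite(LED_PIN, ledState);
  }

  // Condition 2: unpredictable -- is the button down right now?
  // (no debouncing here; it only shows the structure)
  if (digitalRead(BUTTON_PIN) == LOW) {
    Serial.println("button pressed");
  }
}
```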

No, he said he does not need to waste them.

We are into semantics here: what does "100% CPU utilisation" mean? The CPU doesn't care; it asks the memory for the next instruction, gets the instruction, does what it says, and asks for the next instruction. Whether those instructions do anything YOU think is useful is above the CPU's pay grade. Whether it's waiting in a delay loop, or checking to see if there is anything more useful to do and mostly finding there isn't, is of no concern to the CPU.

Yes, exactly, and honestly I think we all understand the point. "Utilisation" and "waste" can both be contentious terms and mean more to humans than they do to CPUs.

My point remains: sitting in delay() and sitting in loop() are all the same to the CPU; in both cases it is waiting for some condition to be met. Which one we use depends on what condition(s) we are waiting for.

Neither is "better", neither is "evil", they are simply appropriate or inappropriate for the job.

I think inappropriate is worse than appropriate.

Leading people down a path that forces them to rewrite their whole code when it has to do more than one thing is evil.
