The merits of delay() vs millis()

A really good reason that experienced devs never say their code is ‘bug free’! :hugs::rainbow::coffee::coffee:

But this might be the last chance to get it.

:sob:


After losing a multi-million dollar company, I don’t trust anyone completely - ever :scream:

When I started, the saying was that only trivial code never crashes. The last data collector I wrote for money ran for 6 months at a time before the owner clicked it off (I don't know why) and then restarted it for another 6 months. I had been writing code for almost 20 years when I did that one; between fault-tolerant coding techniques and running on DRDOS instead of flakey Winblows, it had no substantial bugs. The system itself was written around a 1-pass compiler that added log lines with pre-analysis text.

The stuff I show does not take a genius to get. It's the EE approach as opposed to the IT approach. People with training and success in IT have a hard time getting around EE code because deep down, to them, IT is the "right" way, and finishing coloring in a block (processing a step) before the next begins is like a compulsion.

I'd rather have one task control another than write the function and the control into the same block of code. I save lines that way, about half.

Consider that I've had years of practice when I tell you that practice makes you better.

This and your second post sound to me like:

"all others are using millis wrong, only I do it properly. But I can't show it because my code is confidential."

This summarizes to "I know better, but I won't tell you" - so what's the point you want to tell us?

You should at least give a "proper" code example to back up your development skills.



The big difference is

--- One way, most functions are written to run once, start to end if possible.
--- That way uses execution blocking to keep the process in step, like chess moves.

--- The other way, most functions are written to run often, doing the job in small steps.
--- That way, inputs get read regularly and each process waits only on its own needs.
--- It works out like an RTS game, with everything on its own coordinated path (sketched below).
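
A rough sketch of the two styles, with the usual onboard LED and an arbitrary 500 ms interval (the blocking version is left as a comment so the sketch compiles as-is):

```cpp
// Style 1: blocking. Each pass of loop() runs start to end, and delay()
// keeps the steps in order, but the sketch is frozen inside each delay().
//
//   void loop() {
//     digitalWrite(LED_BUILTIN, HIGH);
//     delay(500);                      // nothing else can happen here
//     digitalWrite(LED_BUILTIN, LOW);
//     delay(500);
//   }

// Style 2: non-blocking. loop() runs over and over, and each task takes
// a small step only when its own timer says it is due.
const unsigned long INTERVAL = 500;    // ms, arbitrary for illustration

unsigned long lastToggle = 0;
bool ledState = false;

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  if (millis() - lastToggle >= INTERVAL) {  // is this task due yet?
    lastToggle += INTERVAL;
    ledState = !ledState;
    digitalWrite(LED_BUILTIN, ledState);
  }
  // Other tasks get checked here on every pass; none of them blocks
  // the others, so inputs can be read thousands of times per second.
}
```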

I know people who quit school saying they knew enough already, that book learnin's useless, and every little argument to protect their stupid. They live in a cross between the real world, with things like TV, and their delusions of life, right and wrong in 3000 BC.

If it helps, I turned 30 in 1986 still working out better ways to code.

Is that why for all these years the forum gurus teach newbies to write non-blocking code when their "good enough" fails miserably?

Maybe the biggest reason for the hatred of delay() is the time spent digging newbies out of that RUT they made so deep that they have to unlearn before they can improve.

Those who ONLY know delay()... NEVER know when it's the wrong choice.


The implication that the average beginner is so dull-witted that, once they've discovered how to use delay(), they're too dumb to proceed on to millis() and FSMs seems a little unfair, to say the least.

Of course it's unfair! It's not the newbie's fault they hit the dead end that is delay(). It's the people telling them it's alright to use it that are at fault.

-jim lee

But it IS alright SOMETIMES.

I see two implications here, both deliberately misleading.

See what this implies: Not Knowing Something Does Not Make Anyone Dumb.

If you ever checked out Nick Gammon's lessons on the subjects you mention, you would know that they are written at beginner level.

PERHAPS it is better for beginners to learn better, so that they can make a real choice on whether the code they spend days on is headed down a dead end that can only be pushed through by increasing application of tedium: a huge waste of time.

A > Well, there you are, beginner: the light blinks, and the lesson is that delay() is all you need for timing.

B > Yeah! Now I want to add a button to turn the blinking off and on the instant I press the button! How do I do both?

A > Well, you CAN'T with that delay() in your sketch... you never said the code would have to do anything more.

Delay is appropriate for Do-One-Thing-At-A-Time code only. Beginners who DON'T KNOW BETTER can spend years doing just that with bigger and bigger code. I have seen dimwits make IT careers around that, and my understanding is that they started out bright and just wore down making cheese-code.

I regard beginners as students. I refuse to teach self-limiting, time-wasting code.
It's the UN-learn-so-you-can-learn part that's the worst; the longer the rut has been grooved into your head via neural connections, the harder it is to get out of.

At least teach the noob that delay() is a code-limiter, and that an engineer knows when it is appropriate to use.

Moving from using delay() in a sketch where it is appropriate, such as Blink, to one using millis() to allow the blinking to be stopped at any time does not involve any un-learning. Rather it involves learning the appropriate technique to meet the requirements.
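
Something along these lines, say, assuming a button wired from pin 2 to ground with the internal pull-up, and an arbitrary 500 ms blink:

```cpp
// Blink that can be stopped or restarted the instant the button is pressed.
const byte BUTTON_PIN = 2;                 // assumed: button from pin 2 to GND
const unsigned long BLINK_INTERVAL = 500;  // ms, arbitrary

unsigned long lastToggle = 0;
bool ledState = false;
bool blinking = true;
bool lastButtonState = HIGH;               // HIGH = released with INPUT_PULLUP

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  // Button task: runs on every pass, acts only on the press edge.
  bool buttonState = digitalRead(BUTTON_PIN);
  if (buttonState != lastButtonState) {
    if (buttonState == LOW) {              // new press detected
      blinking = !blinking;
      if (!blinking) {                     // stop with the LED off
        ledState = false;
        digitalWrite(LED_BUILTIN, LOW);
      }
    }
    lastButtonState = buttonState;
    delay(20);                             // crude debounce on the edge
  }

  // Blink task: acts only when its interval is due, never blocks the button.
  if (blinking && millis() - lastToggle >= BLINK_INTERVAL) {
    lastToggle = millis();
    ledState = !ledState;
    digitalWrite(LED_BUILTIN, ledState);
  }
}
```

Note the delay(20) debounce is itself an example of delay() being fine in context: the short pause only happens on a button edge, where nothing is lost by waiting.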

Which probably requires a change in attitude in those of us helping newbies.

"Ok, so you learned to attach bits of wood to each other by bashing in the sharp metal things we call 'nails' with the thing we call a 'hammer', and that's all well and good, but if you want to use the similarly shaped pointy things we call 'screws' then a hammer is the wrong tool for the job, now you need to use a 'screwdriver'."
"No, you don't hit the screws with the screwdriver, you need to learn a completely different technique".
None of which invalidates the use of a nail and a hammer (or as one of my metal work teachers used to annoyingly sing "Just give me a nail and a hammer, and a picture to hang on the wall").

Yes, that's very much my view. There are certain circumstances where it is perfectly OK, indeed ideal, to use delay(). Other circumstances require millis() and an FSM. There is no merit to teaching a newbie to ignore delay() and skip straight to millis() and FSMs. We should teach them both, and teach them when each is most appropriate.
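
For instance, a one-shot wait in setup() is a place where delay() is the natural choice; the sensor pin and its 100 ms warm-up here are invented for illustration:

```cpp
// One place delay() is perfectly fine: a one-shot wait when there is
// genuinely nothing else the sketch could be doing yet.
const byte SENSOR_POWER_PIN = 8;   // invented pin for illustration

void setup() {
  pinMode(SENSOR_POWER_PIN, OUTPUT);
  digitalWrite(SENSOR_POWER_PIN, HIGH);
  delay(100);   // assumed 100 ms warm-up before the sensor reads reliably
}

void loop() {
  // ...read the sensor, etc...
}
```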

Exactly! Our job is to teach the student to think about the difference between Do-One-Thing-At-A-Time applications and those which require a degree of multi-tasking, and then choose the most appropriate architecture.

Perhaps you should be a bit more flexible. We should teach the most appropriate method for the requirements. Sometimes time-wasting code is fine, often it is not. Just teach both. The concepts are so trivial that any normal newbie will be able to learn both. Let's not make too much of the "un-learn" thing - it's not a real issue in my experience, and in any case it isn't really un-learning, it's more learning.

You have hit the nail on the head :grinning:

On many occasions when replying to problems in the forum I have been aware that a complete restructuring of the sketch would be the best solution, but I am wary of jumping in and suggesting the use of millis() for timing, detecting a state change instead of the current state, using an array, or using an FSM, which are so often the solution.

An FSM is way too much for many beginners to take on, so I like to take an incremental approach. For arrays, for instance, I have sometimes deliberately suggested using variable names with a numeric element so that the later change to using an array is less of a jolt, as in the sketch below.
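
For example (names and pins invented for illustration):

```cpp
// Step 1: numbered variable names the beginner already understands.
//
//   const byte ledPin1 = 5;   // pins are placeholders
//   const byte ledPin2 = 6;
//   const byte ledPin3 = 7;
//   ...one pinMode() and one digitalWrite() per LED, repeated...

// Step 2, later: the numeric suffix becomes an array index, and the
// repetition collapses into a loop.
const byte ledPins[] = {5, 6, 7};

void setup() {
  for (byte i = 0; i < sizeof(ledPins); i++) {
    pinMode(ledPins[i], OUTPUT);
  }
}

void loop() {
}
```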

One problem with providing help on the forum is that there are often others suggesting changes and the OP finds it difficult to follow any of it. In such cases I usually back off and let others get on with it.

Some of my most successful interactions with users have been when we switched to PMs or even email in order to have no interruptions, however useful they may have been.

No doubt others feel the same about my helpful suggestions on occasion too !

Yes, me too.
Sometimes I make a suggestion but it is clear the person asking is more attuned to answers that they are getting from other people, so I let them get on with it. Sometimes my answer is the best one for the person asking, sometimes it misses their point.

Entering my 3rd year writing for these microcontrollers and C++. I did a tutorial which used delay() quite a bit. Then I found out: delay() bad, millis() good. I asked why and learned why. OK, so code without using delay(): cool, millis() works. Then I got a Due, where I got into uMT, an RTOS that runs on the Due. Then I got into ESP32s with FreeRTOS.

Having delay() allows for a level of learning, but there are better ways. delay() has its place, and so do millis() and vTaskDelay(). Each one gets better. Each one offers a path to greater knowledge.
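
On an ESP32 that might look something like this; the pins, stack sizes and intervals are placeholder values:

```cpp
// ESP32 Arduino core (FreeRTOS underneath): vTaskDelay() blocks only the
// task that calls it, so each "blink" keeps its own pace independently.
const byte LED1 = 2;   // placeholder pins
const byte LED2 = 4;

void blink1(void *pvParameters) {
  bool state = false;
  for (;;) {
    state = !state;
    digitalWrite(LED1, state);
    vTaskDelay(pdMS_TO_TICKS(500));   // only this task sleeps
  }
}

void blink2(void *pvParameters) {
  bool state = false;
  for (;;) {
    state = !state;
    digitalWrite(LED2, state);
    vTaskDelay(pdMS_TO_TICKS(300));   // a different, independent pace
  }
}

void setup() {
  pinMode(LED1, OUTPUT);
  pinMode(LED2, OUTPUT);
  xTaskCreate(blink1, "blink1", 2048, NULL, 1, NULL);
  xTaskCreate(blink2, "blink2", 2048, NULL, 1, NULL);
}

void loop() {
  vTaskDelay(pdMS_TO_TICKS(1000));    // loop() runs in its own task too
}
```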

"Each one gets better"? More like: each one offers different features to meet different requirements.