Why is the "delay()" function so widely used?

I understand that it is easy to implement and it doesn't matter for simple programs that just blink an LED, but for the sake of teaching new people proper techniques I don't see why it would be that hard to just use millis(). I think the delay() function just serves to confuse new people until they run into problems later down the line from using it. Are there really any uses for halting your entire program when millis() could do the same job (with only 2-3 lines more code)? Maybe I am just being nitpicky, but as someone who is new to Arduino I was very surprised to see how widely used delay() is.

It encourages efficiency. ;)

It is widely used because it is easy to use. Its name is very descriptive of what it does, the parameter is easy to change, and the result of changing it can be seen at once. The Blink program is a perfect example of this. Change the delay() parameter and the blink rate changes. It is a pity that the example does not emphasise in its comments that delay(1000); really means dontDoAnythingElseForThisNumberOfMilliseconds(1000);
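For reference, here is Blink in essence, with the comment I think delay() deserves (LED_BUILTIN is the on-board LED pin on current boards):

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);
  delay(1000);   // i.e. dontDoAnythingElseForThisNumberOfMilliseconds(1000);
  digitalWrite(LED_BUILTIN, LOW);
  delay(1000);   // the whole sketch stops dead here as well
}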

It is important to get results quickly when learning, and the delay() function provides for that. Of course it has its downsides, but a beginner initially has enough problems getting program logic and flow right and getting it to compile without the introduction of millis() timing at an early stage.

Imho Paul Stoffregen's elapsedMillis library is a much better solution that is almost as simple to use and does not have the drawbacks of the delay() function. It should be standard in the Arduino IDE: http://playground.arduino.cc/Code/ElapsedMillis
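A minimal sketch of how it is used, going by the library's documented interface (the interval and the names here are just for illustration):

#include <elapsedMillis.h>

elapsedMillis sinceToggle;           // counts up in milliseconds by itself
const unsigned int interval = 1000;  // time between LED toggles
bool ledState = LOW;

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  if (sinceToggle >= interval) {
    sinceToggle = 0;                 // reset the counter and toggle
    ledState = !ledState;
    digitalWrite(LED_BUILTIN, ledState);
  }
  // nothing blocks: loop() keeps spinning and can do other work
}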

No matter how many better ways there are to time events on the Arduino, the only way to prevent the delay() function being used is to remove it, and that won't happen.

Making a library available would be a step in the right direction but it would simply mean that the standard "Look at the BlinkWithoutDelay example in the IDE" response would become "Look at the elapsedMillis() library in the Playground".

I am afraid that whilst the delay() function exists it will be used.

Not everybody who plays with Arduinos is doing it to learn and become an efficient programmer. Some just have a project they want to get done, and delay() is a perfectly acceptable way of accomplishing a lot of them.

Arrch: Not everybody who plays with Arduinos is doing it to learn and become an efficient programmer. Some just have a project they want to get done, and delay() is a perfectly acceptable way of accomplishing a lot of them.

That is exactly the problem. Once you have gotten into a habit you are stuck with it. I wonder how many programs there are out there using a bubble sort because it was the method shown in most books...

Which is perfectly fine for small sorts. Better a correctly implemented bubble sort than a broken shell sort.
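For the record, a correct bubble sort is only a few lines (a generic sketch, not taken from any particular book):

void bubbleSort(int a[], int n) {
  for (int i = 0; i < n - 1; i++) {
    // each pass bubbles the largest remaining element to the end
    for (int j = 0; j < n - 1 - i; j++) {
      if (a[j] > a[j + 1]) {         // swap neighbours that are out of order
        int tmp = a[j];
        a[j] = a[j + 1];
        a[j + 1] = tmp;
      }
    }
  }
}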

The use of delay() isn't bad, as long as your code or project is fine with it. Once your project needs something smarter, it's time to look for other timing options.

That's called learning...

The delay() function is bad. It is blocking code. It is a bad habit. It is an oversimplification. It creates far more problems than it solves, and this is especially true since there are better options. Once you have learned something there is no way to "unlearn" it. The only thing you can do is replace knowledge with other, hopefully better, knowledge, and that is a hard process. Do it the right way from the beginning.

UKHeliBob: No matter how many better ways there are to time events on the Arduino, the only way to prevent the delay() function being used is to remove it, and that won't happen.

Making a library available would be a step in the right direction but it would simply mean that the standard "Look at the BlinkWithoutDelay example in the IDE" response would become "Look at the elapsedMillis() library in the Playground".

I am afraid that whilst the delay() function exists it will be used.

Of course I wouldn't suggest removing functionality; there will always be people who want to use it and will use it. But I think you underestimate the impact that would come from simply making that small change to the basic tutorials. By changing the delay example to use a millis() timer, in conjunction with a small comment/explanation of how the timer works (comparing the current millis() value every cycle to the stored millis() value, along the lines of the sketch below), you are introducing early on the proper way to use timers, which is a staple of almost everything you will do in embedded. When I was first introduced to embedded, it was a huge revelation the day I realized that the program is so cyclical and that your logic often needs to branch across multiple cycles. It's obvious now, but it wasn't always. Utilizing a proper timer would be a simple first step towards this realization for a new programmer.
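Something along these lines (essentially BlinkWithoutDelay; the variable names are mine):

const unsigned long interval = 1000;   // time between toggles, in ms
unsigned long previousMillis = 0;      // when the LED last changed state
bool ledState = LOW;

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  unsigned long currentMillis = millis();
  // compare the current millis() value to the stored one on every cycle
  if (currentMillis - previousMillis >= interval) {
    previousMillis = currentMillis;    // remember when we toggled
    ledState = !ledState;
    digitalWrite(LED_BUILTIN, ledState);
  }
  // the rest of loop() keeps running on every pass
}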

Arrch: Not everybody who plays with Arduinos is doing it to learn and become an efficient programmer. Some just have a project they want to get done, and delay() is a perfectly acceptable way of accomplishing a lot of them.

I agree completely. However, my understanding was that the original motivation of the Arduino project was to teach the basics of programming and electronics, which includes good coding techniques. With that in mind, wouldn't it be more important to tailor the documentation to the original audience, those wanting to learn, rather than to the occasional person who just wants to bang out a quick simple project while learning as little as possible? Again, I'm not suggesting removing the delay() functionality, just that shifting away from using it in tutorials and documentation would help those new to programming and prevent problems later down the line.

C-F-K: The use of delay() isn't bad, as long as your code or project is fine with it. Once your project needs something smarter, it's time to look for other timing options.

That's called learning...

but the delay() function isn't really useful in any other way. Once you learn and use the delay function you don't build anything on top of that knowledge. Using the millis() timer is an integral concept that is used for anything beyond a simple blinky-light program and is vital to creating efficient programs. Not to mention it introduces a new programmer to the idea that their program is a rapid, constant loop that you don't want to interrupt unless you absolutely have to. It really would be OK working as you described if they were first taught delay() and then everything beyond that taught millis(), so they could learn and improve, but that's not the case. So many things use delay() and there is never really a clear explanation of the differences and why one is better than the other. It's just assumed they'll figure it out at some point once their program stops working. And of course that's when you get a forum full of posts about something simple that they should have learned from the beginning.

Once you learn and use the delay function you don't build anything on top of that knowledge.

You could substitute the name of practically any Arduino-specific function in that sentence. For instance

Once you learn and use the digitalWrite function you don't build anything on top of that knowledge. Does that sound reasonable?

It really would be OK working as you described if they were first taught delay() and then everything beyond that taught millis()

I agree, but how many people using the Arduino are actually being taught rather than just learning about it for themselves?

UKHeliBob:

Once you learn and use the delay function you don't build anything on top of that knowledge.

You could substitute the name of practically any Arduino-specific function in that sentence. For instance

Once you learn and use the digitalWrite function you don't build anything on top of that knowledge. Does that sound reasonable?

It really would be OK working as you described if they were first taught delay() and then everything beyond that taught millis()

I agree, but how many people using the Arduino are actually being taught rather than just learning about it for themselves?

Yes, but digitalWrite() is actually the proper thing to use. There are no cases that I can think of where it would be better to use delay() instead of a millis() counter, other than for the sole reason of ease of use. Even if you aren't at an execution-time limit for your application, delay() still isn't the proper way to do it. And yes, people are learning for themselves, but that is exactly my point. Example code is the new user's lifeline to progress and learning. This is extremely obvious when you look at half the code posted here, where the comments and variable names have been carried over from example code. Whatever they see in the code examples is what they will use, and that can get them into trouble when they start combining example code with other things, at which point your harmless delay() example can cripple their more complex code.

m4778: There are no cases that I can think of where it would be better to use delay() instead of a millis() counter, other than for the sole reason of ease of use. Even if you aren't at an execution-time limit for your application, delay() still isn't the proper way to do it.

How about dealing with a sensor on a single-wire bus, where you need to send the data line LOW, leave it there for 18 milliseconds, then send it HIGH, wait 40 microseconds, and then check that the device has responded? delay() is the perfect tool for the job here.
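In sketch form, something like this (the pin number is an assumption on my part; the timings are the ones just described):

const int dataPin = 2;          // assumed pin for the sensor's data line

void setup() {
  // request a reading once at startup, for illustration
  pinMode(dataPin, OUTPUT);
  digitalWrite(dataPin, LOW);   // pull the line low...
  delay(18);                    // ...and hold it there for 18 ms
  digitalWrite(dataPin, HIGH);  // release the line
  delayMicroseconds(40);        // wait 40 us
  pinMode(dataPin, INPUT);      // now listen for the device's response
}

void loop() {
}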

As Arrch pointed out above, code only has to be good enough to do what is required. If digitalWrite() does what you need then of course it is OK to use it, but direct port manipulation is "better", so surely that should be used instead....
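To illustrate the point (this assumes an Uno-class AVR, where digital pin 13 sits on PORTB bit 5):

void setup() {
  pinMode(13, OUTPUT);
}

void loop() {
  digitalWrite(13, HIGH);   // portable and readable: "good enough"
  digitalWrite(13, LOW);

  PORTB |= _BV(PORTB5);     // the same pin via direct port access:
  PORTB &= ~_BV(PORTB5);    // faster, but tied to this particular chip
}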

UKHeliBob: As Arrch pointed out above, code only has to be good enough to do what is required. If digitalWrite() does what you need then of course it is OK to use it, but direct port manipulation is "better", so surely that should be used instead....

Because there is a large disconnect in complexity between digitalWrite() and port manipulation, and the cases where you require port manipulation are a lot more specialized and much further down the road of learning. The difference in complexity between delay() and using millis() is small, even for a new programmer. Also, you run into the limitations of delay() so early and so often that it is not at all the same thing as port manipulation. Just look at the sticky at the top of the Project Guidance section of the forum. Clearly enough newbies have run into this exact problem that it required a sticky.

Not to mention that even the documentation on port manipulation includes a section detailing why not to use it:

Generally speaking, doing this sort of thing is not a good idea. Why not? Here are a few reasons:

The difference in complexity between delay() and using millis() is small, even for a new programmer.

I am sorry, but I can't agree with that. I think that you have forgotten what it is like to be new to programming.

Don't get me wrong, I am all in favour of teaching people to use millis() timing when it is appropriate, but in the real world it is difficult to see how to do this except by providing advice when problems arise. If users will not read the advice here on how to post a programming question, then they surely won't read advice on using millis() and why. Sometimes it is necessary to fail at something to learn a lesson.

m4778: Because there is a large disconnect in complexity between digitalWrite() and port manipulation, and the cases where you require port manipulation are a lot more specialized and much further down the road of learning. The difference in complexity between delay() and using millis() is small, even for a new programmer.

I actually think the conceptual leap from delay() to millis() is significantly greater for a newcomer than the leap from digitalWrite() to port manipulation.

It can be very hard to explain some concepts until the newcomer actually experiences the shortcoming that requires them.

nilton61: The delay() function is bad. It is blocking code. It is a bad habit. It is an oversimplification. It creates far more problems than it solves.

This is all true ... but ... an important quality for a teacher is being able to visualize what the student is capable of dealing with. It is all too easy for experts to lose sight of the time when they were beginners. Many students can be turned off a subject by an excessive zeal for doing things "properly or not at all".

In my book the first objective is to get the student interested. Only then will there be a receptacle for the theory.

...R