Easiest way to blink an LED without using delay() or millis()

Normally we use delay() to blink an LED, but delay() is bad because it freezes up the MCU, so we use millis() instead. But millis() is kind of bad too, because it's complicated to use. For timing, why not just use the loop itself? Every pass through loop() takes some nanoseconds, so just add an incrementing counter to the loop and you get an elegant LED blinker. Is this the easiest way to blink an LED on Arduino? I wonder, has anyone ever thought of this technique?

Blink an LED at about 1 Hz @ 8 MHz.

int led;    // current LED state (0 or 1)
int timer;  // loop-iteration counter

void setup() {
  pinMode(8, OUTPUT);
}

void loop() {
  timer++;
  digitalWrite(8, led);
  if (timer % 10000 == 0) { led = !led; }
}

Blink an LED at about 1 Hz @ 16 MHz.

int led;    // current LED state (0 or 1)
int timer;  // loop-iteration counter

void setup() {
  pinMode(8, OUTPUT);
}

void loop() {
  timer++;
  digitalWrite(8, led);
  if (timer % 20000 == 0) { led = !led; }
}

Are you implying that this will not tie up the processor? Because that is your objection to using delay(). Do you think it is easier to do this than to use delay()?

flyandance: I wonder, has anyone ever thought of this technique?

yup.

But is it any good?

Nope.

:-)

delay() and delayMicroseconds() are excellent for most timing requirements.
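
For reference, the classic delay()-based blinker (pin 13 assumed here for illustration):

void setup() {
  pinMode(13, OUTPUT);
}

void loop() {
  digitalWrite(13, HIGH);
  delay(500);              // on for half a second
  digitalWrite(13, LOW);
  delay(500);              // off for half a second, so ca. 1 Hz blink
}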

flyandance: Blink an LED at about 1 Hz @ 16 MHz.

...until you change the code to do something useful. Then it will do whatever it wants to.

Kind regards, couka

But millis() is kind of bad too, because it's complicated to use.

Nonsense. People go about their lives every day using the philosophy espoused in the blink without delay example, without even thinking about it.

"Meet me outside in 20 minutes." You'd have no trouble not showing up 40 minutes early or 2 hours late, would you?

flyandance: But millis() is kind of bad too, because it's complicated to use.

Strange, I find millis() is actually very easy to use.

Code example for a blinking frequency of ca. 2 Hz:

const byte LED = 13;

void setup() {
  pinMode(LED, OUTPUT);
}

void loop() {
  digitalWrite(LED, millis() >> 8 & 1); // bit 8 toggles every 256 ms, ca. 2 Hz blinking
}

For ca. 1 Hz blinking, shift by 9 bits instead of 8:  digitalWrite(LED, millis() >> 9 & 1); // bit 9 toggles every 512 ms, ca. 1 Hz blinking

The blinking frequency does NOT depend on the clock frequency of the microcontroller.

flyandance: Normally we use delay() to blink an LED, but delay() is bad because it freezes up the MCU, so we use millis() instead. But millis() is kind of bad too, because it's complicated to use. For timing, why not just use the loop itself?

We don't want the CPU of the MCU tied up just counting off time.

We want -other- things to run while the LED blinks, and those other things would change the count needed to make a 1 Hz blink. When those other things include occasional responses to user actions, there is no way to tell what count is needed from one blink to the next, so we use millis() or micros(). A timer interrupt could work, but that is exactly how micros() and millis() are implemented, and they let us time loads of events.
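
For the curious, here is a sketch of that timer-interrupt route, assuming a 16 MHz AVR board such as the Uno (the register names are AVR-specific, and pin 13 is just an example):

const byte LED = 13;

void setup() {
  pinMode(LED, OUTPUT);
  noInterrupts();
  TCCR1A = 0;                          // Timer1: normal port operation
  TCCR1B = (1 << WGM12)                // CTC mode, TOP = OCR1A
         | (1 << CS12) | (1 << CS10);  // prescaler 1024
  OCR1A = 15624;                       // 16 MHz / 1024 = 15625 ticks/s, so one match per second
  TIMSK1 = (1 << OCIE1A);              // enable compare-match A interrupt
  interrupts();
}

ISR(TIMER1_COMPA_vect) {
  digitalWrite(LED, !digitalRead(LED));  // toggle once per second, no matter what loop() does
}

void loop() {
  // free for other work; the blink keeps its timing regardless
}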

What's so hard about millis()?

The unsigned longs? They're like the hour hand on a round clock, only they count up past 4 billion instead of 12.

As long as you use unsigned variables and subtract the start time from the end time, you get the difference, up to the maximum count, which for millis() is about 49.7 DAYS. With only the second hand of a clock you can time up to 59 seconds; with millis() you get almost two months as your longest interval. And it never makes a difference if the end is less than the start: rollover does not matter with unsigned math, you always get the true difference, up to one full trip around the counter minus 1.

time_difference = time_end - time_start;

On a round clock, suppose it is 2 o'clock and you want to know how long you've been there since arriving at 10 o'clock. The hour hand is at 2; to subtract 10 (the start time, when you arrived), move the hand backwards 10 hours and it points to 4. You've been there 4 hours: clock 2 minus clock 10 (move backwards for minus) is clock 4. Unsigned math works the same way; the round clock is just unsigned base 12.

BTW, I use an unsigned int with ( millis() & 0xFFFF ) to time events of 1 minute or less.
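
A tiny demonstration of that subtraction, with start and end values chosen (hypothetically) to straddle the millis() rollover at 2^32:

void setup() {
  Serial.begin(9600);
  unsigned long timeStart = 4294967000UL;  // 296 ms before the counter wraps
  unsigned long timeEnd   = 500UL;         // 500 ms after the wrap
  Serial.println(timeEnd - timeStart);     // prints 796, the true elapsed ms
}

void loop() {}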

flyandance:
Normally we use delay() to blink an LED, but delay() is bad because it freezes up the MCU, so we use millis() instead. But millis() is kind of bad too, because it's complicated to use. For timing, why not just use the loop itself? Every pass through loop() takes some nanoseconds, so just add an incrementing counter to the loop and you get an elegant LED blinker. Is this the easiest way to blink an LED on Arduino? I wonder, has anyone ever thought of this technique?

[the 8 MHz and 16 MHz counter sketches from the original post]

I have several “arduinos”. One is a UNO, another is a Pro Trinket 3V, and yet another is an Arduino 101. They all run at different clock speeds. This means that I need to rewrite my code for each of these boards, or include some complicated compile-time code. How is that simpler than including the code below in my loop function?

  if (millis() - previousSecondMillis >= oneSecond) {
    previousSecondMillis += oneSecond;
    // ... do stuff once a second ...
  }

(ps, delay is not bad. It’s just misunderstood.)

Hi,
If I use the OP's 16 MHz sketch and add a couple of lines, it's not a 1 Hz flash anymore.

int led;
int timer;
int ledPin = 13;
int testdummy = 12;  // a second pin, used only to add work to the loop

void setup()
{
  pinMode(ledPin, OUTPUT);
  pinMode(testdummy, OUTPUT);
}

void loop()
{
  timer++;
  digitalWrite(ledPin, led);
  if (timer % 20000 == 0)
  {
    led = !led;
  }
  // the added "couple of lines": extra work that stretches every pass through loop()
  for (int y = 0; y < 10; y++)
  {
    digitalWrite(testdummy, led);
  }
}

I think the OP needs to realise that the timer variable is only incremented WHEN the code gets back to the top of the loop, and that depends on how much code has to be processed before it returns there.

millis() is the way to go.

The OP's timer variable, even when unsigned long, will have the same overflow as millis().

Tom… :slight_smile:

Also, simple “busy loops” can get optimized in unexpected ways by the compiler.
The following code snippet will typically execute in zero time:

for (long i=0; i<1000000; i++)
   ;

Because the compiler says: “inside of loop does nothing, i is gone when the loop is done, therefore the net effect is nothing, therefore I can omit all the code that this would take.”
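
One way (among others) to keep the compiler from deleting such a loop is to declare the counter volatile, which tells it that every access must really happen:

volatile long i;  // volatile: the compiler may not optimize accesses away

void setup() {}

void loop() {
  for (i = 0; i < 1000000; i++)
    ;  // now genuinely burns time, though how much still depends on clock and compiler
}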

Another way to think about it is that millis() and other interrupt-driven tasks will happen REGARDLESS of what else the CPU is doing.

As soon as you ask the CPU to participate in some other activity, the timing of loop() will vary slightly, but if you run that same variation a million times a second... it adds up.

PaulS: Nonsense. People go about their lives every day using the philosophy espoused in the blink without delay example, without even thinking about it.

"Meet me outside in 20 minutes." You'd have no trouble not showing up 40 minutes early or 2 hours late, would you?

Actually, it's very logical. Since blinking an LED is not serious stuff, instead of saying "hey, meet me outside in 20 minutes", I will say "go eat a watermelon and meet me outside once you are done." Meanwhile, no one is stopping you from Netflix and chill, or whatever else you wish to do, while eating your melon. There is also another hidden benefit: depending on how fast you come out, I can guess whether you like watermelon or not, so next time I know what fruit to bring you.

Just remember that when you want to time something for real, don't write some silly excuse-code.