[SOLVED] Why does delay() not hang in an ISR?

I’ve always believed (and, on forgetful occasions years ago, personally experienced) that on AVR-based Arduinos, delay() should hang when called from within an ISR, or when interrupts are disabled.

Furthermore, looking at the code for delay() in wiring.c, one would conclude that delay() should hang (never return) when interrupts are disabled.

However, this is not the case. In fact, within an ISR, or when interrupts are disabled, delay() apparently does nothing. (An Arduino Pro Mini was used for all tests.)

This came up in another discussion recently, and the example that I present below comes from the Arduino reference page for the MsTimer2 library.

The program below blinks the on-board LED with a one-second period, regardless of whether the delay(2000) statement is present within the timer ISR. I can’t explain why, and hope someone else can!

// Toggle LED on pin 13 each second
#include <MsTimer2.h>

void flash() {
  static boolean output = HIGH;

  digitalWrite(13, output);

  delay(2000);  //<< *** This delay() is ignored!

  output = !output;
}

void setup() {
  pinMode(13, OUTPUT);

  MsTimer2::set(500, flash); // 500ms period
  MsTimer2::start();
}

void loop() {
}

For another example, take Blink:

/*
  Blink
 Turns on an LED on for one second, then off for one second, repeatedly.
 
 This example code is in the public domain.
 */

// Pin 13 has an LED connected on most Arduino boards.
// give it a name:
int led = 13;

// the setup routine runs once when you press reset:
void setup() {                
  // initialize the digital pin as an output.
  pinMode(led, OUTPUT);     
}

// the loop routine runs over and over again forever:
void loop() {
  digitalWrite(led, HIGH);   // turn the LED on (HIGH is the voltage level)
  delay(100);               // wait 0.1 s
  digitalWrite(led, LOW);    // turn the LED off by making the voltage LOW
  cli();
  delay(900);    //<<*** does not hang, just ignored
  sei();
  delay(100);
}

That's a good question...

From https://github.com/arduino/ArduinoCore-avr/blob/master/cores/arduino/wiring.c:

void delay(unsigned long ms)
{
	uint32_t start = micros();

	while (ms > 0) {
		yield();
		while ( ms > 0 && (micros() - start) >= 1000) {
			ms--;
			start += 1000;
		}
	}
}

Hmmm... I don't think micros() is affected by interrupts being disabled. It reads directly from the hardware timer. So, indeed, how can delay() not work as normal?

Here is micros():

unsigned long micros() {
	unsigned long m;
	uint8_t oldSREG = SREG, t;
	
	cli();
	m = timer0_overflow_count;
#if defined(TCNT0)
	t = TCNT0;
#elif defined(TCNT0L)
	t = TCNT0L;
#else
	#error TIMER 0 not defined
#endif

#ifdef TIFR0
	if ((TIFR0 & _BV(TOV0)) && (t < 255))
		m++;
#else
	if ((TIFR & _BV(TOV0)) && (t < 255))
		m++;
#endif

	SREG = oldSREG;
	
	return ((m << 8) + t) * (64 / clockCyclesPerMicrosecond());
}
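
For reference, timer0_overflow_count is only ever incremented by the Timer 0 overflow ISR in the same file (excerpt trimmed by me; the millis()/fract bookkeeping is omitted), and that ISR obviously cannot run while interrupts are disabled:

ISR(TIMER0_OVF_vect)
{
	// ... millis() bookkeeping trimmed ...
	timer0_overflow_count++;
}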

m is the Timer 0 overflow count (timer0_overflow_count), maintained by the Timer 0 overflow interrupt. t is the current hardware timer count (TCNT0).
m cannot increment while interrupts are off, so the micros() value should loop, with the low-order bits changing but not the upper bits. Like:

0x11000010
0x11000020
  :
0x110000F0
0x11000000
0x11000010

(Something like that, anyway…)
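
To make that concrete, here is a small host-side C snippet (mine, not Arduino code; the frozen overflow count is a made-up value) that applies micros()'s own formula, ((m << 8) + t) * 4 on a 16 MHz board, with m held constant the way it is when the overflow ISR can't run:

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t m = 0x44000;  // hypothetical frozen timer0_overflow_count
    // t (TCNT0) keeps counting in hardware regardless of the I flag,
    // so only the low bits of the result ever change
    for (unsigned t = 0; t < 256; t += 64) {
        // 64 / clockCyclesPerMicrosecond() == 4 at 16 MHz
        printf("micros() = 0x%08lX\n", (unsigned long)(((m << 8) + t) * 4));
    }
    // when t wraps 255 -> 0, the value falls back to 0x11000000:
    // the low bits loop forever, the upper bits never advance
    return 0;
}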

Just summing up what has been said and adding a conclusion.

delay() is coded like so:

void delay(unsigned long ms)
{
	uint32_t start = micros();

	while (ms > 0) {
		yield();
		while ( ms > 0 && (micros() - start) >= 1000) {
			ms--;
			start += 1000;
		}
	}
}

micros() more or less freezes after one overflow when interrupts are suspended, as has already been pointed out: from then on it just cycles within a single Timer 0 period of about 1024 µs. So after start += 1000 has executed once or twice, start is larger than anything micros() can return. Consider the test

(micros() - start) >= 1000

Since (micros() - start) is computed in unsigned arithmetic, the subtraction underflows and the result becomes huge (much bigger than 1000, anyway). The test above is therefore always true, the inner loop counts ms down to zero in quick succession, and delay() returns almost immediately.
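
A quick host-side simulation (my own stub, not the Arduino core) makes this visible. fake_micros() mimics a micros() whose low 10 bits tick over in 4 µs steps while the upper bits stay frozen, and delay_sim() has the same structure as delay() in wiring.c (yield() omitted):

#include <stdio.h>
#include <stdint.h>

static uint32_t fake_now = 0x11000000;  // upper bits frozen: overflow ISR blocked
static unsigned long calls = 0;

static uint32_t fake_micros(void) {
    calls++;  // each call stands for ~4 us of real time
    fake_now = (fake_now & ~(uint32_t)0x3FF) | ((fake_now + 4) & 0x3FF);
    return fake_now;
}

static void delay_sim(unsigned long ms) {
    uint32_t start = fake_micros();
    while (ms > 0) {
        while (ms > 0 && (fake_micros() - start) >= 1000) {
            ms--;
            start += 1000;
        }
    }
}

int main(void) {
    delay_sim(2000);
    printf("delay(2000) returned after ~%lu us of simulated time "
           "(a real delay would be 2000000 us)\n", calls * 4);
    return 0;
}

This prints roughly 9000 µs for the requested 2,000,000: one pass through the 1024 µs window pushes start out of reach of micros(), and from then on every test is true, so ms drains at about one count per loop iteration.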

edit:

This "rougher" version of delay() appears to work correctly even in a cli() / sei() block. I'm still looking into why it doesn't hang, though.

void delay1(unsigned long ms)
{
  uint32_t start = micros() & 0xFFFFFF00;  // ignore the free-running low bits

  while (ms > 0) {
    // yield();
    while (ms > 0 && ((micros() & 0xFFFFFF00) - start) >= 1000) {
      ms--;
      start = micros() & 0xFFFFFF00;  // re-anchor rather than adding 1000
    }
  }
}
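
For anyone who wants to see the difference on real hardware, a sketch along these lines should show it (untested, just my suggestion; delay1() is the function above, and pin 12 is an arbitrary choice). Watch the pin on a scope or logic analyzer:

void setup() {
  pinMode(12, OUTPUT);
}

void loop() {
  cli();
  digitalWrite(12, HIGH);
  delay(500);      // stock delay(): the high pulse is only ~1 ms wide
  digitalWrite(12, LOW);
  delay1(500);     // delay1(): the low time is close to the requested 500 ms
  sei();
  delay(1000);     // normal pause between test pulses
}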

Agreed. The hardware timer continues to increment the low-order bits of the micros() value, but the upper bits, which come from the interrupt-maintained overflow count, are static. So the inner while loop in delay() is dramatically sped up.

If correct, is this behavior intentional or accidental? If the former, it is an ingenious way to prevent the hang. But then, the hang is a good learning experience.

One of my few mantras in life (I try not to adopt them if I can help it) is "always suspect incompetence before conspiracy". It doesn't apply to myself, of course; I know it's definitely incompetence in those cases.