I recently worked on code that required precise timing of microsecond intervals from an ISR. The issues that surfaced forced me to take a deep dive into the Arduino timer functions (see also the thread on micros) and into some caveats worth considering.
The wiki on delayMicroseconds() has a good section on caveats/issues, but it may not be obvious at first what these imply. The delayMicroseconds() function provides accurate delays, as needed by some types of applications (e.g. SoftwareSerial bit banging), by disabling interrupts during its core delay loop. Still, pending interrupts (e.g. timer overflow, serial, others) may execute before the delayMicroseconds function returns and thus upset precise timing. As a consequence, time critical code needs to guard itself against interrupts even though delayMicroseconds is used. The function therefore does not, by itself, provide a proper solution for time critical code, and it has other negative consequences as well, as described in the following:
In LiquidCrystal bundled with the Arduino IDE 0016, the functions “clear()” and “home()” use delayMicroseconds to pause for 2000 microseconds. Since interrupts are disabled for the whole 2000 microseconds, while timer0 overflows every 1024 microseconds and only one overflow interrupt can be held pending, this will cause the loss of one timer overflow interrupt for every call to home and clear (any delay in excess of 1024 microseconds risks this, and millis() falls behind as a result). Considering that the LiquidCrystal library as such does not need exact microsecond interval timing, but rather a guaranteed minimum delay, this is an unnecessary cost to pay, as it is likely to upset other code.
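What a library like this actually needs is a guaranteed minimum wait that leaves interrupts running. A minimal sketch of such a wait, built on a busy loop around micros() (the helper name waitAtLeastMicros is mine, not part of any library):

```cpp
// Minimal sketch of a guaranteed-minimum wait that leaves interrupts enabled.
// waitAtLeastMicros is a hypothetical helper, not an existing Arduino function;
// micros() is the standard Arduino microsecond counter (4 us resolution at 16 MHz).
static void waitAtLeastMicros(unsigned long us)
{
  unsigned long start = micros();
  // Unsigned subtraction handles micros() rollover correctly.
  while ((unsigned long)(micros() - start) < us)
    ;  // busy wait; timer overflow, serial and other interrupts keep running
}
```

The ~2000 microsecond pause after clear() and home() could then be written as waitAtLeastMicros(2000) and would never block an interrupt.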
So why not use delay() then when longer delays are needed (e.g. 2000 microseconds)? The issue here is that delay() depends on millis(), which in turn depends on the timer0 overflow interrupt. The delay() function may return immediately when called for a 1 millisecond delay, because the next overflow interrupt may be only 1 microsecond away. So what if we ask for a two millisecond delay then? Overflow interrupts are actually 1024 microseconds apart, and the remainder (24 microseconds per overflow) is accounted for by the millis() function such that on occasion it gets incremented twice for a single timer0 overflow interrupt. As a consequence, even a two millisecond delay may return almost immediately, with no delay beyond the function call overhead.
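To see why, here is a simplified sketch of how a millis()-based delay works (the structure of the stock implementation, not the verbatim wiring.c source):

```cpp
// Simplified millis()-based delay. If the timer0 overflow interrupt fires
// right after start is read, millis() jumps by 1 (occasionally 2) and the
// loop can exit after only a few microseconds of real time.
void delaySketch(unsigned long ms)
{
  unsigned long start = millis();
  while (millis() - start < ms)
    ;  // busy wait on the millisecond counter
}
```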
Further, if we use the Serial library to receive data at 9600 baud, a new character (8 data bits plus start and stop bits) will arrive roughly every millisecond. Any call to delayMicroseconds with a delay in excess of the per-character time may result in lost characters for data received during the delay. At 57600 baud, the per-character interval is as low as 174 microseconds.
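The per-character times are simply the 10-bit frame length divided by the baud rate:

```cpp
// 8N1 frame = 1 start + 8 data + 1 stop = 10 bits per character
const unsigned long usPerChar9600  = 10UL * 1000000UL / 9600UL;   // = 1041, roughly one millisecond
const unsigned long usPerChar57600 = 10UL * 1000000UL / 57600UL;  // = 173, the ~174 us quoted above
```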
My proposal is that the interrupt guard used within delayMicroseconds should be removed. This will allow use of the function for guaranteed minimum delays without upsetting timer overflow, serial or other interrupts. Then what about existing code relying on precise delays? Unless this code guards itself against interrupts, it is actually none the worse, since interrupts get re-enabled by delayMicroseconds prior to return anyway.
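As a sketch of what the proposed change amounts to (modelled on the 16 MHz AVR timing used by the stock function, but not the verbatim wiring.c source, and under a hypothetical name), the calibrated busy loop stays and only the interrupt guard goes:

```cpp
// Sketch of delayMicroseconds() with the interrupt guard removed.
// At 16 MHz the two-instruction loop below takes 4 cycles = 0.25 us per pass.
void delayMicrosecondsNoGuard(unsigned int us)
{
  if (--us == 0)
    return;            // a 1 us request just returns; the call overhead already costs about that
  us <<= 2;            // four loop passes per microsecond
  us -= 2;             // partially compensate for call/setup overhead
  __asm__ __volatile__ (
    "1: sbiw %0,1" "\n\t"   // 2 cycles
    "brne 1b"               // 2 cycles while the branch is taken
    : "=w" (us)
    : "0" (us)
  );
}
```

Code that genuinely needs cycle-exact timing can then guard itself explicitly: save SREG, call cli(), run the timing-critical sequence including the delay, and restore SREG afterwards.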
Further, I suggest that the delay() function be rewritten to deal properly with short delays. As implemented now, it may be as much as 20% off even for a 10 millisecond delay.
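One way to do this (a sketch only, under an assumed name, not a tested drop-in replacement) is to measure the elapsed time with micros() instead of counting millis() ticks, which brings the granularity down from one ~1024 microsecond overflow period to a few microseconds:

```cpp
// Sketch of a delay() that measures elapsed time with micros().
// Note: ms * 1000 overflows for delays above about 70 minutes,
// which is fine for illustrating the idea.
void delayAccurate(unsigned long ms)
{
  unsigned long start = micros();
  unsigned long us = ms * 1000UL;
  while ((unsigned long)(micros() - start) < us)
    ;  // interrupts keep running while we wait
}
```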