calling millis() or micros() from interrupt code

I’m running into a problem when calling millis() or micros() from within interrupt code. Strange things happen to timing: all of a sudden, all sorts of time-dependent polling starts running erratically, e.g. checking millis() inside loop() gives unexpected results.

The problem comes up in both Arduino-0012 and 0013 (Mac OS X).

Here’s an example:

    // width is the pulse length in usecs, for either polarity
    static uint16_t last;
    uint16_t width = micros() - last;
    last += width;

When I comment out the call to micros(), all is well.

But here’s the strange bit: when I replace the call to micros() with my own copy of it, everything works as expected again!

    static unsigned long myMicros() {
        extern volatile unsigned long timer0_overflow_count;
        uint8_t oldSREG = SREG;
        cli();                      // disable interrupts so the reads are atomic
        uint32_t t = TCNT0;
        if ((TIFR0 & _BV(TOV0)) && (t == 0))
            t = 256;
        uint32_t m = timer0_overflow_count;
        SREG = oldSREG;             // restore the previous interrupt state
        return ((m << 8) + t) * (64 / clockCyclesPerMicrosecond());
    }

    // width is the pulse length in usecs, for either polarity
    static uint16_t last;
    uint16_t width = myMicros() - last;
    last += width;

I’m totally stumped by this behavior. As you can see, the above is a work-around. But it sure would be nice to understand what is happening and get the proper fix in there, wherever that is…

What is going on here? Does anyone have a suggestion?


Looks like gcc 4.3.0 can generate bad code. The problem is solved by upgrading the gcc 4.3.0 that came with Arduino-0013 to gcc 4.3.2.

I replaced /Applications/arduino/tools/avr with a symlink to /usr/local/AVRMacPack-20081213, also installed on my system.


micros() and millis() don't increment while you're in an interrupt routine; this can cause small timing differences that might be causing you problems.

I can't get my interrupt routines to compile because I have a call to micros(). I get an "error: 'micros' was not declared in this scope" in the function I have associated with my interrupt. Did you ever encounter this problem?

Isn't that a different issue? I'm reading out the current value, not delaying...

As for losing millisecond "tick" interrupts, I assume that this won't happen if interrupt routines are quick enough, i.e. well under 1 msec. The tick (timer 0 overflow) will stay pending until the current interrupt routine exits.

Haven't seen the undeclared issue you mention. AFAIK, micros() was added in Arduino-0013; maybe the error came from trying to compile with 0012?

You solved the problem by upgrading the compiler to avr-gcc 4.3.2.

Version 4.3.0 (included with arduino-0012 and 0013) has a number of serious bugs, particularly in the OS X version. One of them is that it produces bad code whenever a function is called from inside an ISR. Basically, the generated code fails to preserve all of the affected registers in the ISR before calling the function. This leads to register corruption and all kinds of unpredictable behavior.

[EDIT] - Never mind. Found a syntax error. All is well. :-[ I seem to be having the same problem calling micros() from an interrupt in Arduino-0013 on Windows. Does anyone know if upgrading to gcc 4.3.2 will help on Windows as well? If so, how do I do it? I'm new to Arduino (but liking it so far!)

Thanks etracer, for the exact analysis and explanation. I see there's a nice workaround for 4.3.0 with the NewSoftSerial code I just found at

Cool. It feels a lot better when funny stuff has a logical explanation!


PS. If there isn't one already, it might be an idea to set up a wiki page with GCC issues and info such as which version of GCC comes with which version of the Arduino IDE.

For reference, here's a (rather long) thread in these forums about NewSoftSerial where the interrupt bug was discovered and documented. Also included is the assembly work-around that I came up with.

NewSoftSerial Library: An AFSoftSerial update