Micros() within interrupt - limitations or alternatives?

I have an ISR that generally executes very quickly, but needs to test for signal loss timeouts. I believe that micros() will work within an interrupt, at least up to some point before the clock overflows. Can anyone tell me what is the minimum time (if any) I can count on from micros() working in an ISR?

Alternatively, can you help suggest alternative code -- a simple counter, loop or such -- that will allow me to approximate micros() on a standard 16MHz Arduino?

I believe that micros() will work within an interrupt

What do you mean?

You want to call micros() inside an interrupt? Or are you concerned you will drop micros() interrupts if you stay in your ISR too long?

I want to call micros() from within an interrupt; I need a way to measure time within the ISR to test for timeout situations.

micros() just reads a variable; it doesn't do any processing.
As far as I'm aware it works inside an interrupt.

Can anyone tell me what is the minimum time (if any) I can count on from micros()

What do you mean?

Do you mean the resolution of micros()? Or are you concerned with rollover? (If it's rollover, there is a posting by Nick Gammon, I think, which describes why rollover is not an issue when comparing times.)

CPARKTX:
I have an ISR that generally executes very quickly, but needs to test for signal loss timeouts.
Alternatively, can you help suggest alternative code -- a simple counter, loop or such -- that will allow me to approximate micros() on a standard 16MHz Arduino?

I think you would get much more useful answers if you explain WHAT you are trying to achieve rather than focussing on a question about HOW to do some part of the problem. There may be other approaches.

As usual, post your code. As usual, please put it within code tags [+code] [+/code] (without the + signs).

...R

As ever, the general answer is that you do not measure time with code inside an interrupt.

As pointed out, you presumably need to explain exactly what it is you want to do rather than simply throwing terms about such as "signal loss timeouts", which are, in themselves, meaningless to anyone not privy to your personal idiosyncratic thought processes. :smiley:

Now the point here is that you are apparently using an interrupt to respond to something so important that it must be attended to within the time it takes to execute your primary "loop", which in general would be expected to cycle well within a millisecond. If such an event is connected to an interrupt, then the interrupt code should not wait for a further event; that further event should itself be connected to an interrupt, so that the necessary decision can be made at that latter point, within the second interrupt service routine.

Such questions however, are most often based on an expectation that all these events cannot be dealt with within the main processing loop which actually indicates that the main processing loop is not written properly in the first place. Since interrupts themselves have an overhead, it is more often than not more efficient to handle complex events within the main processing loop.

That said, millis() and micros() should do nothing more than read variables, involve no waiting and should in fact assemble as macros, not functions, so should confer negligible penalty.

CPARKTX:
Can anyone tell me what is the minimum time (if any) I can count on from micros() working in an ISR?

micros() should give an accurate result until about a millisecond elapses: with interrupts disabled inside the ISR, the Timer0 overflow interrupt cannot run, and micros() can compensate for at most one pending overflow (256 ticks of 4 µs). However you are not recommended to waste a millisecond inside an ISR.

Paul__B:
That said, millis() and micros() should do nothing more than read variables, involve no waiting and should in fact assemble as macros, not functions, so should confer negligible penalty.

They are functions, and micros() reads the hardware timer. millis() returns the most recent value (from the most recent timer interrupt), and micros() will be accurate to within 4 µs (because of the hardware prescaler).