do while in an interrupt

I want to program a short delay inside an interrupt that is generated by Timer1.

delayMicroseconds() doesn’t seem to work, as if there is a conflict with my having set Timer1. So I tried the following code to check micros() in a loop until the appropriate time has passed. This works fine in the main loop, but it hangs the program when it’s in the Timer1 interrupt routine, or even when it is in the INT0 interrupt.

Any suggestions? I’ve spent a couple of hours on this.

tmp = micros();
while (micros() - tmp < 500) {
}


You could try posting your code.

micros does not increment when an ISR is being run.

And 500 micros is 100x too long a delay to use in an ISR.

Mark

holmes4: micros does not increment when an ISR is being run.

?

micros does not increment when an ISR is being run.

I'm no expert, but I am guessing what he means is that, by definition, an interrupt makes the processor stop everything else it is doing to do a specific thing, so it can't run a delay. But I found this post:

http://forum.arduino.cc/index.php/topic,41819.0.html

Where it says this :

A- Is there some rule against using a delay in an interrupt call?

B- If so, why does delayMicroseconds() work? (not good for me who needs a longer delay though)

So I'm guessing the "?" is a reference to the contradiction between "A" and "B".

There is no contradiction, because delay != delayMicroseconds.

However, any delay (not just the delay() function) in interrupt context should be seriously thought about.

delay != delayMicroseconds.

Ok. Got it. delay(ms) is NOT ALLOWED but delayMicroseconds() is.

This might be relevant: http://www.iar.com/Global/Resources/Developers_Toolbox/C_Cplusplus_Programming/Interrupt%20Service%20Routine%20calling%20functions.pdf

raschemmel: This might be relevant: file:///M:/Temp/ARDUINO%20UNO/DATASHEETS/Interrupt%20Service%20Routine%20calling%20functions.pdf

We can't read a file on your M: drive. ;)

holmes4: micros does not increment when an ISR is being run.

And 500 micros is 100x too long a delay to use in an ISR.

I see. Do you have a reference for this? The Reference page for attachInterrupt() says that delay() and millis() won't work, but it doesn't say anything about delayMicroseconds() and micros().

The reason I need this is that I have pulses coming in once per second from my GPS, which trigger Timer1 to generate 5 interrupts, once every 200 ms. The interrupts are used to make a 5 Hz signal on a digital output. But the clock on the Arduino is not perfectly accurate (+/- 0.2%). To compensate, I want to measure exactly how far off the clock is, and then in the timer interrupt routine, wait the appropriate number of microseconds before setting the digital output high.

I guess the way to go is not to set the pin high in the timer interrupt itself, but to set a flag in the interrupt, and then detect the flag in the main loop and use delayMicroseconds() from there to set the pin high.

Also, delayMicroseconds() seems to conflict with Timer1, which is why I'm not using that instead of the do while.
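That flag-in-the-ISR pattern might look roughly like the following. This is only a host-testable sketch of the idea: Arduino's delayMicroseconds() is stubbed out, and the names (pulseDue, timer1Isr, loopOnce) are my own, not from this thread:

```cpp
#include <cstdint>

// Stand-in for Arduino's delayMicroseconds(), so the pattern can be
// shown self-contained; on a real board this comes from the core.
static uint32_t fakeElapsedUs = 0;
void delayMicroseconds(uint32_t us) { fakeElapsedUs += us; }

volatile bool pulseDue = false;  // set by the ISR, cleared in loop()
bool pinState = false;           // stands in for the output pin

// Timer1 compare ISR: do the bare minimum and return.
void timer1Isr() {
    pulseDue = true;
}

// One pass of the main loop: the calibration delay happens here, in
// normal context, where delayMicroseconds() and micros() work fine.
void loopOnce(uint32_t correctionUs) {
    if (pulseDue) {
        pulseDue = false;
        delayMicroseconds(correctionUs);  // clock-drift compensation
        pinState = true;                  // digitalWrite(pin, HIGH)
    }
}
```

The volatile qualifier matters: it stops the compiler from caching pulseDue in a register across the ISR boundary.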

The Reference page for attachInterrupt() says that delay() and millis() won't work

That's because they don't.

but it doesn't say anything about delayMicroseconds() and micros().

That's because they do.

But you really shouldn't - at least, not for significant amounts of time.

raschemmel:
I’m no expert, but I am guessing what he means is that, by definition, an interrupt makes the processor stop everything else it is doing to do a specific thing, so it can’t run a delay. But I found this post:

All an interrupt does is let the processor stop doing one thing, and do another instead (the interrupt service routine). It doesn’t stop everything else. It does however turn interrupts off, so it cannot do a second interrupt while it is doing the first.

The hardware timers continue to run (as does the ADC converter, I2C, SPI, the serial port etc.).

The micros() function directly reads the timer hardware, and thus can continue to return good results, for about a millisecond. After that it will have missed an “overflow” interrupt.

http://www.gammon.com.au/interrupts

tmp = micros();

while (micros() - tmp < 500) {
}

We really do not encourage attempting to waste time inside an ISR. This can easily lead to missing other important interrupts, like incoming serial data. However as to why that particular code didn’t work, seeing the data type of tmp would help.
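On that point: micros() returns an unsigned long (32 bits on AVR), and tmp needs the same type. With matching unsigned types the subtraction is also safe across the roughly 70-minute micros() rollover, thanks to modular arithmetic. A minimal host-side illustration, with the micros() readings passed in by hand:

```cpp
#include <cstdint>

// micros() - tmp, done in the return type of micros() itself.
// Unsigned 32-bit subtraction wraps modulo 2^32, so the elapsed time
// comes out right even when the "now" reading has rolled over past 0.
uint32_t elapsedUs(uint32_t now, uint32_t then) {
    return now - then;
}
```

If tmp were a 16-bit int instead, the high bits of the micros() reading would pollute the difference and the comparison against 500 would misbehave.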

I want to program a short delay inside an interrupt that is generated by Timer1.

Probably the real question is: why do you want to do this?


Can’t you adjust the time interval used for Timer 1?

Look, an interrupt itself is going to take (say) 5 µS anyway, so another one or two won’t matter. Delaying for 500 µS is pushing it.

The micros() function directly reads the timer hardware, and thus can continue to return good results, for about a millisecond. After that it will have missed an "overflow" interrupt.

I think you might have it here. micros() seems to increment for a ms or so inside the interrupt, but no further.

I will take the delay out of the interrupt routine.
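The "a ms or so" figure follows from how the Arduino core sets up Timer0. With interrupts disabled, micros() can still read the hardware counter, but it misses the overflow interrupt that extends the count. Assuming a 16 MHz part with the core's standard /64 prescaler (those numbers come from the core setup, not from this thread):

```cpp
#include <cstdint>

// Timer0 tick length and overflow period on a 16 MHz AVR with the
// Arduino core's /64 prescaler; micros() stays correct inside an ISR
// only until the first missed overflow.
constexpr uint32_t kCpuHz        = 16000000;
constexpr uint32_t kPrescaler    = 64;
constexpr uint32_t kTimer0Counts = 256;  // 8-bit timer

constexpr uint32_t tickUs     = kPrescaler / (kCpuHz / 1000000);  // 4 µs
constexpr uint32_t overflowUs = tickUs * kTimer0Counts;           // 1024 µs
```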

megascops: But the clock on the Arduino is not perfectly accurate (+/- 0.2%). To compensate, I want to measure exactly how far off the clock is, and then in the timer interrupt routine, wait the appropriate number of microseconds before setting the digital output high.

I should also point out that there is jitter with interrupts. As explained on my page ( http://www.gammon.com.au/interrupts ) an interrupt may be delayed a bit (especially if another interrupt is currently running). So for example if a Timer 0 interrupt is being serviced, and say that takes a few microseconds, attempting to compensate for Timer 1 by a few microseconds is probably not going to help.

If you stick to letting the timer itself generate the signal (which you can do) then no interrupts are necessary and it will be as accurate as it can be, bearing in mind the resolution of the interrupt and the processor clock accuracy.

http://www.gammon.com.au/forum/?id=11504
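For the record, the no-ISR route Nick describes is Timer1's CTC mode with "toggle OC1A on compare match". The pin toggles once per match, so a 5 Hz square wave needs a match every 100 ms. A sketch of the arithmetic, assuming a 16 MHz clock and a /64 prescaler (the register names in the comment are ATmega328 conventions, not from this thread):

```cpp
#include <cstdint>

constexpr uint32_t kCpuHz     = 16000000;
constexpr uint32_t kPrescaler = 64;

// Compare value for a given output frequency: two toggles make one
// output cycle, and the counter runs 0..OCR1A inclusive (hence the -1).
constexpr uint32_t ocr1aFor(uint32_t outHz) {
    return kCpuHz / (kPrescaler * 2 * outHz) - 1;
}

// On the board this value would go into OCR1A, roughly:
//   TCCR1A = _BV(COM1A0);                         // toggle OC1A on match
//   TCCR1B = _BV(WGM12) | _BV(CS11) | _BV(CS10);  // CTC mode, /64 clock
//   OCR1A  = ocr1aFor(5);
```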

Nick Gammon: If you stick to letting the timer itself generate the signal (which you can do) then no interrupts are necessary and it will be as accurate as it can be, bearing in mind the resolution of the interrupt and the processor clock accuracy.

Timer inaccuracy is exactly what I am trying to compensate for. I want to have the 5 Hz pulses on two Arduinos at the same time to within a few tens of microseconds. The GPS pulses on the two units start them off together at the top of each second, but the Timers for the subsequent 4 pulses get out of sync by up to 2 ms by the end of the second (because of the inaccuracy of the timers). I am working on moving the microsecond delay into the main loop.

Update: It's working great now.

I'm just curious about why adjusting the timer value can't work. Whether a constant or something you work out at runtime, you should be able to tweak the timer value.

Theory: Timer 1 is a 16 bit timer, so it can count up to 65536 (technically 0 to 65535).

The interval depends on the prescaler, from which you can choose: 1, 8, 64, 256, 1024.

Assuming you are running the processor at 16 MHz, that translates to a precision of:

Prescaler  Period (µS)
    1        0.0625
    8        0.5 
   64        4.0
  256       16.0
 1024       64.0

Given that you are trying to time 1/5 of a second (200 mS) and the timer can count up to 65536 then you would need a prescaler of 64. That is:

Count to 50000 * 4 µS = 200000 µS (200 mS)

With a prescaler of 64 you can then tweak that to +/- 4 µS at a time. So for example, if it is 4 µS too slow you count up to 50001.

Now your original post gave the impression you were dealing with much larger amounts (you quoted 500 µS) so it would seem reasonable that you could get it to within a couple of µS accuracy just by changing the count.
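Plugging in the ±0.2% clock error mentioned earlier shows how much tweaking headroom that gives (my arithmetic, following Nick's numbers):

```cpp
#include <cstdint>

// /64 prescaler on a 16 MHz clock: 4 µs per tick, 50000 ticks = 200 ms.
constexpr uint32_t tickUs   = 64 / (16000000 / 1000000);  // 4 µs
constexpr uint32_t periodUs = 50000 * tickUs;             // 200000 µs

// A ±0.2% clock error over a 200 ms interval is up to ±400 µs, so the
// count only ever needs adjusting by up to ±100 either side of 50000.
constexpr uint32_t maxErrorUs         = periodUs * 2 / 1000;  // 400 µs
constexpr uint32_t maxCorrectionTicks = maxErrorUs / tickUs;  // 100
```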

Nick Gammon: I'm just curious about why adjusting the timer value can't work. Whether a constant or something you work out at runtime, you should be able to tweak the timer value.

Theory: Timer 1 is a 16 bit timer, so it can count up to 65536 (technically 0 to 65535).

You're absolutely correct, Nick. I misspoke. I am using Timer 2, which is only 8-bit and therefore not high enough resolution. The reason I am using Timer 2 is that I am also using the RCArduinoFastLib to read and control servos, which, I believe, uses Timer 1.

I am using the pulses on one Arduino to start a sonar pulse, and on the other to do analog reads to detect the pulse. The more precisely I can sync those two processes, the more precise my measurement of the distance between the two objects. Since I also want to measure the relative velocity of the two objects, I need the distance measurements to be very precise. So far, I have it to about +/-2 inches, at a range of about 80 feet. But this is with hand-calibrating the difference between the clocks of the two Arduinos. I am trying to automate the calibration with the code I am working on.

update: And... it works.

David