Delay an incoming square wave signal with high accuracy

Hello everyone, I am new to programming and also to this forum.
I have already spent several hours trying to solve my problem but haven't been successful so far.

What I want to do:

I have a square-wave signal from a frequency generator with periods in the range of 1 ms to 30 ms.
I need to copy that signal with an adjustable delay, also in the same range, so that I can shift the two signals relative to each other over at least a full phase range of 2π.

In order to achieve high accuracy and to trigger with the lowest possible jitter, I have used an external interrupt on pin 2 (interrupt 0).

attachInterrupt(0, CreatePulse, RISING);
.
.
.
void CreatePulse() {
  delayMicroseconds(Delaytime);  // adjustable delay
  digitalWrite(PinOut, HIGH);
  delayMicroseconds(T / 2);      // output high for half a period
  digitalWrite(PinOut, LOW);
}

(I know that one should not use delays in interrupt routines, but as this is the only task of my Arduino, I did so anyway.)

The problem starts to occur as soon as “Delaytime” gets close to T/2.

My next approach was to use the Timer1 library to generate a timed interrupt after “Delaytime” has passed and then start the pulse-generating procedure... but I could not solve the problem that way either, because interrupts can't interrupt each other.
So effectively I miss a rising edge on my input and the output becomes garbage again.

I would be glad if someone could give me a hint or some advice on how to solve this problem.

Quick suggestion: don't use interrupts; code it like BlinkWithoutDelay, using micros() instead of millis().

Interrupts cannot normally be nested, and even if they could, that would kill your current approach. You'll get even more trouble with delays > T/2, e.g. if the frequency increases while your delay remains fixed.

Are you turning off the interrupts when you set the variable T?

Hello, and thanks for your answers,

plancette: Are you turning off the interrupts when you set the variable T?

T is set once in setup() and never changed again, because the input signal will stay the same for several hours (at least one measurement).

DrDiettrich: Quick suggestion: don't use interrupts; code it like BlinkWithoutDelay, using micros() instead of millis().

Interrupts cannot normally be nested, and even if they could, that would kill your current approach. You'll get even more trouble with delays > T/2, e.g. if the frequency increases while your delay remains fixed.

Yes, I already thought about using micros()... but is there any other option that reacts as fast as the external interrupt? I would like to keep the timing jitter of my output within ±20 µs. So constantly checking in a while loop and taking the time with micros() might be an option. By the way, I also tried to use micros() inside the interrupt, but I only got values in a small range (something between 4000 and 5000 when comparing several values taken during an interrupt). The idea was simply to record the event time in the ISR, and then react as fast as possible outside the interrupt, waiting for the corresponding time to raise or lower my output.

micros() is unusable within an ISR, because it's updated by another (now blocked) interrupt.

Using micros() within loop() should work quite precisely, with only an almost constant offset due to loop processing and comparison.

DrDiettrich: micros() is unusable within an ISR, because it's updated by another (now blocked) interrupt.

Using micros() within loop() should work quite precisely, with only an almost constant offset due to loop processing and comparison.

I guess I will try the loop, and maybe I can design it to give me a constant delay in order to keep the jitter low... That's actually the only problem I have to care about.

I would permanently read the input and compare it with the previous value... but these are already several steps, which cost me accuracy in detecting the moment of the rise/fall on my input...

Fri_Co:
In order to achieve high accuracy and to trigger at time with lowest possible jitter, I have used an interrupt on pin 2(0).

Always a bad choice.

Using interrupts is just as likely to cause more jitter. Why? Because you will be fighting the timer interrupt.

Precise variable delays can be implemented by using for loops with interrupts turned off.

The only problem with using polling in the loop is that you may wish to be able to adjust the timing “on the fly”. It is essentially impractical to do this and implement the delay simultaneously, so you need a switch in the loop (that is, read a switch) to swap over to the procedure which makes the adjustment.

Since the shift larger than T/2 is just a nice-to-have for my application, I decided to work only with the external interrupt and the Timer1 interrupt. This works fine for delays of up to (T/2 - 15) µs, which is enough for what I need. Anyway, if someone wants to share a solution in the future, it can be posted in this thread. Thanks so far for all the advice and comments.

Best, Frieder

I'm with DrDiettrich in suggesting you ditch interrupts and just use a tight while(1) loop (not loop(), as that goes off and does other stuff between iterations). You say the Arduino is doing nothing else, so poll the input pin (using direct hardware reads), record the micros() time the change happened, and apply an offset for when the output needs to change.

Do:
Has the input pin changed? Yes: calculate the offset time the output pin needs to change.
Has the calculated time arrived? Yes: flip the output pin.
Loop

Riva: I'm with DrDiettrich in suggesting you ditch interrupts and just use a tight while(1) loop (not loop(), as that goes off and does other stuff between iterations). You say the Arduino is doing nothing else, so poll the input pin (using direct hardware reads), record the micros() time the change happened, and apply an offset for when the output needs to change.

Do:
Has the input pin changed? Yes: calculate the offset time the output pin needs to change.
Has the calculated time arrived? Yes: flip the output pin.
Loop

Thanks, I will try that. But the advantage of the external interrupt is that I can react faster to a change on the input pin. Imagine the change happens right after I checked for it; then I would only detect it after one complete loop cycle. And I would have to implement a few more things, because micros() overflows after about 71 minutes, but I need operation times of up to 24 h. So the while(1) loop would grow quite a bit, which would cost me a few µs in accuracy... which really is the highest priority for now.

As millis() and micros() work with unsigned integers, the overflow is handled automatically and is nothing to worry about, provided you always compute elapsed time by subtracting timestamps. The math still works.

If your Arduino has to do more jobs, then your best option is setting a timer interrupt. Have a look at phase cutting (AC control), as they do exactly that: interrupt upon the incoming pulse (zero crossing), set a timer to some time in the future, set the output pin in that timer interrupt, and then set another timer interrupt for switching it off again. That can get you pretty high accuracy.