Delay in timer interrupt

Hi, I have a problem.
I use a Mega and the following program:

#include <TimerOne.h>
const int led = LED_BUILTIN;  
const uint8_t bit_timings[2] = {116, 58};
const int In3 = 5;   // L298 H-bridge inputs (pin numbers assumed; not shown in the post)
const int In4 = 6;
uint8_t Number = 0;

void setup(void)
{
  pinMode(led, OUTPUT);
  Timer1.initialize(500000);
  Timer1.attachInterrupt(DoInt); 
  Serial.begin(57600);
}

int ledState = LOW;
volatile unsigned long blinkCount = 0; 

// send one DCC bit as two half-periods: 58 µs each for a "1", 116 µs each for a "0"
void output_bit(uint8_t bit_val)
{
  digitalWrite(In3, LOW);
  digitalWrite(In4, HIGH);
  delayMicroseconds(bit_timings[bit_val]);
  digitalWrite(In3, HIGH);
  digitalWrite(In4, LOW);
  delayMicroseconds(bit_timings[bit_val]);
}

// Timer1 ISR, attached in setup(); runs every 500 ms
void DoInt(void)
{
  if (ledState == LOW) {
    ledState = HIGH;
    blinkCount = blinkCount + 1;  // increase when LED turns on
  } else {
    ledState = LOW;
  }
  digitalWrite(led, ledState);
  for (i = 1; i < 16; i++) {
  outputbit(1);
  }
}

void loop(void)
{
  unsigned long blinkCopy; 
  
  while (Serial.available() == 0 ) {
  }

  int Val = Serial.parseInt();
  while (Serial.available() > 0)  // while there is still something in the buffer
  {
    int trash = Serial.read();    // drain it
    delayMicroseconds(10);
    Number = Val;
  }
}

When I use the function output_bit() in the loop, it works perfectly. But because I want to expand the loop section with some waiting for input etc., I would like to use a timer interrupt.

That's how I found TimerOne.h, and it makes the LED blink. But the function output_bit() is not working correctly.

I think the problem is the delay in output_bit(). The timing is very precise, and it may not work inside the timer interrupt.

Has anyone a solution?

thanks Trebbie

I first wrote that output_bit() calls delayMicroseconds(), which depends on the timer0 interrupt to count microseconds across the timer0 rollover. Oops: I'd confused the delayMicroseconds() mechanism with the delay() and micros() mechanisms.

I'm not crazy about the idea of any delay type calls in an interrupt, but...

While delay() depends on micros(), which depends on the timer0 interrupt, delayMicroseconds() doesn't seem to in the AVR code, at least not that I can see. It appears to just use a subtract-one-and-compare-to-zero loop. Here's what delayMicroseconds() looks like for a 16 MHz clock:


void delayMicroseconds(unsigned int us)
{
	// call = 4 cycles + 2 to 4 cycles to init us(2 for constant delay, 4 for variable)

	// calling avrlib's delay_us() function with low values (e.g. 1 or
	// 2 microseconds) gives delays longer than desired.
	//delay_us(us);
	// for the 16 MHz clock on most Arduino boards

	// for a one-microsecond delay, simply return.  the overhead
	// of the function call takes 14 (16) cycles, which is 1us
	if (us <= 1) return; //  = 3 cycles, (4 when true)

	// the following loop takes 1/4 of a microsecond (4 cycles)
	// per iteration, so execute it four times for each microsecond of
	// delay requested.
	us <<= 2; // x4 us, = 4 cycles

	// account for the time taken in the preceding commands.
	// we just burned 19 (21) cycles above, remove 5, (5*4=20)
	// us is at least 8 so we can subtract 5
	us -= 5; // = 2 cycles,

	// busy wait
	__asm__ __volatile__ (
		"1: sbiw %0,1" "\n\t" // 2 cycles
		"brne 1b" : "=w" (us) : "0" (us) // 2 cycles
	);
	// return = 4 cycles
}

I'd like to see the OP clarify what they mean by "the function output_bit() is not working correctly". "Not working correctly" doesn't help me understand what it is doing that it shouldn't.


I cannot see what is wrong.

When it is in the loop, the 16 ones are sent with the correct timing.

It is part of a program for making a DCC signal for model trains. The pulses have to be 58 microseconds and 116 microseconds long.

When I put the command, with some more ones and zeros, in a normal function in the loop, the train responds correctly; when I send the code through the interrupt, it doesn't work correctly.

So I presume the timing is incorrect.

You are, quite literally, the only one who can see what is wrong. All the rest of us can do is speculate. We can't see what you're seeing.

So here's some speculation.

In output_bit there are two calls to delayMicroseconds(), each of which will delay for a minimum of 58 uSec.

That's 116 uSec per call to output_bit.

Your interrupt function calls output_bit 16 times.

That means each time the interrupt is called, it has cumulative delays of 1856 uSec; almost 2 mSec. Perhaps closer to 4 mSec if you need to send a bunch of 0 bits.
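Spelled out:

16 ones:  16 × (2 × 58 uSec)  = 1856 uSec ≈ 1.9 mSec
16 zeros: 16 × (2 × 116 uSec) = 3712 uSec ≈ 3.7 mSec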

Ouch. I can't see that working out well.

This doesn't look right…

  for (i = 1; i < 16; i++) {
  outputbit(1);       // i?
  }

Did someone say?

a7

So the 4 mSec are too much?

It should be output_bit(1); that typo is fixed. And i should start at 0; also fixed.

Yes, about a factor of 100x too much. Interrupts are supposed to be quick.

But is it only bad practice that you should avoid, or does it actually cause problems with other functions, e.g. delay()?

Please feel free to continue on your current path. I'm done.

There you go again.

a7

You may not like it.

But if you want real help you have to provide more details.
What does the whole signal train of all bits look like?

Post a timing-diagram

You wrote about a DCC signal for model trains.
There must be a specification available that describes what these signals look like.

The code above outputs a "1" 15 times:
111111111111111

The condition i < 16 means i counts from 1 to 15, which is 15 iterations,
because 16 < 16 is false.

I assume that you want to send data that changes:
sometimes sending 100111110101111
sometimes sending 000110000111001
sometimes sending 110000011100011
etc.

This means your call to

output_bit(1) // ALWAYS send a ONE

has to look something like this:

output_bit( variable_that_can_be_zero_OR_one ) // send ZERO or ONE 

Whether my assumption is correct depends on the DCC signal specification that you have to provide.

You mentioned that you have a code version that works as expected.
You should post that code version too.

What you are doing with the code above is not really using a timer interrupt.

Real use of a timer interrupt would be to set up a timer interrupt every 58 µs that calls the interrupt routine (= the one function that gets called whenever the timer interrupt fires).

Calling the function once every 58 µseconds corresponds to a frequency of
1 / (58 / 1,000,000) ≈ 17,241.38 Hz.

That should be doable, except that you can't create every frequency exactly, because a timer interrupt is based on an integer counter and a prescaler.
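For reference, the standard AVR CTC formula (textbook material, not from this thread) gives

OCR1A = F_CPU / (prescaler × f_target) − 1
      = 16,000,000 / (1 × 17,241.38) − 1
      ≈ 927

With prescaler 1 that is 928 ticks of 62.5 ns = exactly 58 µs, which is why the generated code below arrives at OCR1A = 927.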

Inside this interrupt function the bits will be "processed" without any delay, because the function ITSELF is called every 58 µseconds.

I googled for detailed information about how to set up the timer-interrupt registers

and found this website that offers the calculations for different Arduinos:
https://www.arduinoslovakia.eu/application/timer-calculator

Example output:

// AVR Timer CTC Interrupts Calculator
// v. 8
// http://www.arduinoslovakia.eu/application/timer-calculator
// Microcontroller: ATmega2560
// Created: 2024-06-19T06:47:35.658Z

void setupTimer1() {
  noInterrupts();
  // Clear registers
  TCCR1A = 0;
  TCCR1B = 0;
  TCNT1 = 0;

  // 17241.379310344826 Hz (16000000/((927+1)*1))
  OCR1A = 927;
  // CTC
  TCCR1B |= (1 << WGM12);
  // Prescaler 1
  TCCR1B |= (1 << CS10);
  // Output Compare Match A Interrupt Enable
  TIMSK1 |= (1 << OCIE1A);
  interrupts();
}

void setup() {
  setupTimer1();
}

void loop() {
}

ISR(TIMER1_COMPA_vect) {
}

Your bits will then be processed inside the function

ISR(TIMER1_COMPA_vect) 

The name inside the parentheses, TIMER1_COMPA_vect, is fixed.

The digit specifies which timer you want to use.
An Arduino Mega 2560 has six timers (Timer0 through Timer5).
Some of them are already used for things like the millis() function.
You have to look up which timer is used for what.
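To make that concrete, here is a minimal sketch of such an interrupt routine (my own illustration, untested; the packet[] buffer, its 44-bit length and the variable names are assumptions, while In3/In4 are the H-bridge pins from the opening post). The timer fires every 58 µs; a "1" half-period then lasts one tick and a "0" half-period two ticks (2 × 58 µs = 116 µs):

// In3/In4: H-bridge pins, declared as in the opening post
volatile uint8_t packet[44];      // one DCC bit per entry, filled from loop()
volatile uint8_t bitIndex = 0;    // bit currently being sent
volatile uint8_t ticks = 0;       // 58 µs ticks elapsed in this half-period
volatile bool secondHalf = false;

ISR(TIMER1_COMPA_vect) {
  uint8_t ticksNeeded = packet[bitIndex] ? 1 : 2;  // "1" = 58 µs, "0" = 116 µs
  if (++ticks < ticksNeeded) return;               // half-period not over yet
  ticks = 0;

  // end of a half-period: flip the H-bridge polarity
  digitalWrite(In3, secondHalf ? LOW : HIGH);
  digitalWrite(In4, secondHalf ? HIGH : LOW);

  if (secondHalf) {                                // a full bit has been sent
    if (++bitIndex >= 44) bitIndex = 0;            // start the packet over
  }
  secondHalf = !secondHalf;
}

loop() is then free to refill packet[] and read Serial, because no time is burned in delayMicroseconds().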

Hi everybody,

thanks for the help.
I'll give some more information:
A DCC signal is a pulse signal built from short pulses made by the Arduino.
Such a signal contains 0 and 1 bits. The decoder in the train understands a pulse of 58 µs as a 1 and a pulse of 116 µs as a 0.
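To illustrate (my own drawing based on this description, as seen on one H-bridge input; the polarity flips each half-period):

one "1" bit:            one "0" bit:
 ____                    ________
|    |____              |        |________
 58µs  58µs              116µs     116µs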

The pulses are generated by an L298 H-bridge, by switching In3 and In4 for the required amount of time.

The most important DCC signal is the one for speed, and it is built as:
16×1 for activation - stop bit 0 - 8 bits address - stop bit 0 - 8 bits speed - stop bit 0 - 8 bits error code - stop bit 1.

So an example would be:
11111111111111111 0 00000011 0 10110011 0 10110000 1

In my sketch I made some functions to alter the address, speed and error code.
In the original loop there is a function that is called every time:

void ZendLok() {
  uint8_t i; 
  MaakXOR(); 

  for (i = 0; i < 16; i++) {
    output_bit(1);
  }
  output_bit(0);
  for (i = 0; i < 8; i++) {
    output_bit(bitRead(Lok.adres,7-i));
  }  
  output_bit(0);
  for (i = 0; i < 8; i++) {
    output_bit(bitRead(Lok.speed,7-i));
  } 
  output_bit(0);
  for (i = 0; i < 8; i++) {
    output_bit(bitRead(Lok.XOR,7-i));
  } 
  output_bit(1);

}

The function MaakXOR() calculates the error code.
Lok is set up as:

struct LokPakket{
  uint8_t adres;
  uint8_t speed;
  uint8_t XOR;
};

But the most important thing to remember is: the signal should be sent every 0.25 s, otherwise the train stops.

In loop() there are also functions to change the speed or the address, and these block while waiting for input. That usually takes longer than 0.25 s, so the train stops whenever I change something.

So my solution to tackle this is a timer interrupt.

But that is not so easy...

Maybe eliminating the 250ms wait-for-input pause from loop would be a better solution?

Have you seen this by Robin2 (@Robin2)?

Doing something four times a second and gathering serial input and watching a few pushbuttons and probably much more will be harder to do with interrupts than without, in direct proportion to how much interrupt programming you've done.

OIC @DaveX has pointed you to one very good tutorial on non-blocking serial communications.

Blink Without Delay, an example in the IDE, will get you started thinking about programming in a non-blocking style.
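Applied to this thread, that style might look something like this sketch (untested; ZendLok() is from your earlier post, while the 200 ms margin and the input handling are my choices):

unsigned long lastPacket = 0;

void loop() {
  // resend the DCC packet well inside the 0.25 s deadline, without blocking
  if (millis() - lastPacket >= 200) {
    ZendLok();
    lastPacket = millis();
  }

  // only touch Serial when bytes are already waiting, so loop() never stalls
  if (Serial.available() > 0) {
    int val = Serial.parseInt();  // note: parseInt() itself can block briefly
    // ... update Lok.adres / Lok.speed from val ...
  }
}

ZendLok() still blocks for the few milliseconds it takes to clock out the packet, but that is harmless as long as nothing else in loop() blocks.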

Buttons are easy, too. I think everyone should learn how to do it, but you can use a library (ezButton doesn't suck) for now with no shame.

a7

What loop where? Have you posted the code you are talking about?

a7


This makes me think of user @MicroBahner.

User MicroBahner has written a library that does not do DCC signals, but something similar:
his library creates multiple things, all based on timer interrupts:

  • stepper-motor signals
  • servo signals
  • timers

A servo signal is similar to your DCC signal:
generate, once every 0.02 seconds, a pulse 1000 µs up to 2000 µs long.

So setting up a timer-interrupt to process these 45 bits

         10        20        30        40  
123456789012345678901234567890123456789012345
111111111111111110000000110101100110101100001

might be interesting. Or at least he can give some advice on how to set up which hardware timer for bit-banging the 45 bits.
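For example, the LokPakket struct posted above could be flattened into such a bit sequence with a helper along these lines (a sketch; the buffer and the function name are my assumptions, and with the 16-bit preamble from ZendLok() it yields 44 bits rather than 45):

// hypothetical helper: flatten one LokPakket into single-bit buffer entries
void buildPacket(uint8_t *buf, const LokPakket &lok) {
  uint8_t n = 0;
  for (uint8_t i = 0; i < 16; i++) buf[n++] = 1;             // preamble
  buf[n++] = 0;
  for (int8_t i = 7; i >= 0; i--) buf[n++] = bitRead(lok.adres, i);
  buf[n++] = 0;
  for (int8_t i = 7; i >= 0; i--) buf[n++] = bitRead(lok.speed, i);
  buf[n++] = 0;
  for (int8_t i = 7; i >= 0; i--) buf[n++] = bitRead(lok.XOR, i);
  buf[n++] = 1;                                              // stop bit
}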

You should really post a timing diagram that explicitly shows what the signal train looks like.

There is still room for interpretation as to whether the next rising edge

  • alternates between 58 and 116 µseconds
    or
  • rises every 116 µseconds
    or
  • whether there is some additional time
    (and if so, how long?)
    to indicate the end of a pulse

Additionally, you should

always

post a complete sketch.

With your code snippet above I have to scroll back through the thread and scratch my head:

which other code snippet shows the working version of your function output_bit()?

Code sections have a limited height, so it doesn't matter if your code is 3000 lines long.

There are several ways to use one of the hardware timers to output the needed signal, WITHOUT tying up the processor in numerous delayMicroseconds() calls. That would be FAR more efficient, and FAR more accurate, than software-timed bit-banging, and would allow the processor to continue doing other things while the bitstream is being output. The Waveform Generation Mode of the 16-bit timers would be but one way to do it.
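As a sketch of that idea (my own illustration, untested; the pin and prescaler choices are assumptions): run Timer1 in CTC mode, let the hardware toggle OC1A (pin 11 on a Mega 2560), and have the compare ISR load the length of the next half-period. The second H-bridge input would then need the inverted signal, e.g. from a hardware inverter.

volatile uint8_t packet[44];            // one DCC bit per entry, filled elsewhere
volatile uint8_t idx = 0;
volatile bool secondHalf = false;

void setupDccTimer() {
  pinMode(11, OUTPUT);                  // OC1A on the Mega 2560
  noInterrupts();
  TCNT1 = 0;
  TCCR1A = (1 << COM1A0);               // toggle OC1A on every compare match
  TCCR1B = (1 << WGM12) | (1 << CS11);  // CTC mode, prescaler 8 -> 0.5 µs/tick
  OCR1A = 115;                          // (115 + 1) × 0.5 µs = 58 µs
  TIMSK1 = (1 << OCIE1A);
  interrupts();
}

ISR(TIMER1_COMPA_vect) {
  if (secondHalf) {                     // a full bit has just finished
    if (++idx >= 44) idx = 0;           // repeat the packet
  }
  secondHalf = !secondHalf;
  OCR1A = packet[idx] ? 115 : 231;      // next half-period: 58 µs or 116 µs
}

Because the pin is toggled by the timer hardware, the output edges stay cycle-accurate even if another interrupt delays the ISR slightly.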
