Elapsed time reported by Arduino incorrect?

Hello everyone, I am having an issue with the elapsed time reported by my program. I am running Arduino 0017 on a Modern Device RBBB. I have some code that I am modifying: it drives a stepper motor controller in a sine wave by pulsing the step pin at a varying frequency. After going through some gearing, this produces a linear up-and-down motion that is sinusoidal. The delay between steps is read from an array of delays I fill at the beginning.
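Roughly, the table is built along these lines (a simplified illustration, not my exact code; the formula here is just to show the idea):

// Simplified illustration: the per-step delay is largest near the ends
// of the stroke (where sinusoidal motion is slowest) and smallest in
// the middle (where it is fastest).
const int steps = 200;      // steps per half-cycle (illustrative)
float t_interval[steps];

void buildTable() {
  for (int i = 0; i < steps; i++) {
    float s = sin(PI * (i + 0.5) / steps);  // small at the ends, ~1 in the middle
    t_interval[i] = 1.0 / max(s, 0.05);     // clamp so end delays stay finite
  }
}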

I have two variables, time1 and time2, which I record during an interrupt triggered by an IR beam-break switch; the beam is broken as the plunger reaches peak amplitude. My issue is that the time I get when I

Serial.print((float)(time2-time1)/1000);

is not the same value I get when I time the motion with a stopwatch. Let's say I record the period as 11 seconds on my stopwatch; the Arduino reports a time of approximately nine seconds.

I was curious if someone else can spot where I am losing time? If anything, I figured all of my additional overhead would make the Arduino record a period longer than the one observed.

I am using a resonator and not a crystal, but I was under the impression that the timing error associated with the resonator shouldn't be anywhere near 2-3 seconds after just 11 seconds; a typical ceramic-resonator tolerance of around 0.5% works out to only about 55 ms over that span.

I attach the main code below. If any other information would help in debugging my issue, just ask.

Any and all help is much appreciated.

The main loop snippet of my code looks like this

void loop() {

  // read the analog in value:
  checkChange();
  ampValue = analogRead(analogamp) / 50 + 10;

  // MOTOR RUNNING CODE GOES HERE.  ABOVE LOOP ONLY RUNS ON CHANGE AND SWITCHES SPEEDS

  for (i = 0; i < steps; i++) {
    tc_interval[i] = (int)(t_interval[i] * outputValue); // outputValue used to be time
  }
  // running = true;
  // while(running){

  digitalWrite(directionPin, HIGH);

  for (j = 0; j < steps; j++) {
    t = tc_interval[j];
    for (k = 0; k < ampValue; k++) {  // k multiplier for amplitude
      digitalWrite(stepPin, HIGH);
      delayMicroseconds(t);
      digitalWrite(stepPin, LOW);
      delayMicroseconds(t);
    }
  }

  checkChange();

  digitalWrite(directionPin, LOW);

  for (j = 0; j < steps; j++) {
    t = tc_interval[j];
    for (k = 0; k < ampValue; k++) {  // k multiplier for amplitude
      digitalWrite(stepPin, HIGH);
      delayMicroseconds(t);
      digitalWrite(stepPin, LOW);
      delayMicroseconds(t);
    }
  }

  // }

} // end main loop


void inter_time() {
  if (wheretime == 0)
  {
    time1 = millis();
    Serial.print(13, BYTE);
    Serial.print("MM");

    wheretime = 1;
  }
  else
  {
    time2 = millis();
    wheretime = 0;
    //random spot 017 005 003
    Serial.print(13, BYTE);
    Serial.print("T2: ");
    Serial.print((float)(time2 - time1) / 1000);
    //Serial.println((float)(time1-time2)/1000);
  }
}

Where in this code are time1 and time2 declared? Where are time1 and time2 set?

There is a function, inter_time, that modifies time1 and time2, but that function isn't called.

You are subtracting time1 from time2, then dividing by 1000. If that division happens in integer arithmetic, you could be losing as much as just shy of a whole second there.

Hello. I declare time1 and time2 in the header as

volatile unsigned long time1; //period logger 1
volatile unsigned long time2; //period logger 2
volatile int wheretime=0;

I call the function inter_time using an interrupt that is attached in my setup function as

void setup() {
  attachInterrupt(0,inter_time,RISING);
  pinMode(stepPin, OUTPUT);
  pinMode(directionPin, OUTPUT);
  
  Serial.begin(9600); 
  delay(500);

  Serial.print( 12, BYTE); // clear lcd
  Serial.print(  4, BYTE); // turn off blinking cursor
  Serial.println("Reading 0 1 2");
  Serial.println("0:");
  Serial.println("1:");
  Serial.print("2:");

}

How do I lose approximately one second in my time2 - time1 calculation? I was under the impression that my operation (float)(time2-time1)/1000 first took the difference of my times and then cast it to a float before dividing by 1000. Perhaps I misunderstand. Also, I previously output time1 and time2 via Serial.print and subtracted them manually without doing the division, and the time returned is the same as my (float)(time2-time1)/1000 call. Am I incorrect in this operation?

And I will add: quite the weather we are having these last few weeks.

I am in Seattle too.

I was curious if someone else can spot where I am losing time?

I cannot say if this is what's causing the problem, but you have too much going on in your interrupt service routine.

Accessing Serial (especially at 9600 baud) inside an ISR can cause problems...

Serial.print(13,BYTE);
Serial.print("MM");

Performing a floating-point operation in an ISR can also cause problems...

Serial.print((float)(time2-time1)/1000);

The ISR should record millis() and set a flag. In loop(), check for the flag, then do all the Serial and floating-point work. Be sure to protect the flag, time1, and time2 with a critical section; a minimal sketch of the idea follows.
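Something along these lines, assuming your existing time1/time2/wheretime globals (the periodReady flag is my addition):

volatile unsigned long time1;          // period logger 1
volatile unsigned long time2;          // period logger 2
volatile int wheretime = 0;
volatile boolean periodReady = false;  // set by the ISR, cleared in loop()

void inter_time() {
  if (wheretime == 0) {
    time1 = millis();                  // first beam break of the period
    wheretime = 1;
  } else {
    time2 = millis();                  // second beam break
    wheretime = 0;
    periodReady = true;                // a full period has been captured
  }
}

void loop() {
  if (periodReady) {
    noInterrupts();                    // critical section: copy shared data
    unsigned long t1 = time1;
    unsigned long t2 = time2;
    periodReady = false;
    interrupts();
    Serial.print((float)(t2 - t1) / 1000.0);  // now safely outside the ISR
  }
  // ...motor code...
}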

Noted; I can remove those bits of code from the interrupt quite easily. In a previous iteration of this code, instead of capturing the time1 and time2 variables via a mechanically fired interrupt, I simply set time1 = millis() at the beginning of the first loop and time2 = millis() at the end of the second big ugly loop, i.e., I called millis() at the points where the interrupt should be firing, and the time reported was slow as well. Perhaps this provides some additional info? Even without an ISR at all, my times were still recorded low.

Thanks for the input.

B

I was under the impression that my operation (float)(time2-time1)/1000 first took the difference in my times and then cast it to a float before dividing by 1000?

The time difference is computed, then divided by 1000. Because time1, time2, and 1000 are all integers, integer arithmetic is performed, resulting in truncation; only the truncated result is cast to a float.

You might be able to divide by 1000.0, which is a floating-point constant, instead of an int. But better would be to store the time2 - time1 result in a float first; there's no need to cast it then. After storing the delta time as a float, divide it by 1000.0.

Quite the weather we are having the last few weeks.

It sucks. Beats ice and snow, though.

But does it not indicate that the operation is working OK? If I print out time1 and then time2 and subtract these values on my trusty handheld calculator, the number of milliseconds I get corresponds to the value displayed by (float)(time2-time1)/1000. I.e.:

time1: 400713  time2: 407340

so subtracting I get 6627,

and the time displayed by (float)(time2-time1)/1000 = 6.63?

thanks,

This works for me; 1.52 is printed...

volatile unsigned long time1; //period logger 1
volatile unsigned long time2; //period logger 2
volatile int wheretime=0;

void inter_time() 
{
  if ( wheretime == 0 )
  {
    time1 = millis();
    wheretime = 1;
  }
  else
  {
    time2 = millis();
    wheretime = 0;
  }
}
 
void setup( void )
{
  Serial.begin( 9600 );
}

void loop( void )
{
  inter_time();
  delay( 1515 );
  inter_time();
  Serial.println( (float)(time2-time1)/1000 );
}

I guess that the cast operation has a higher precedence than the divide operation; I thought the cast was performed after the division.
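A quick sketch makes the two readings explicit (illustrative values; only the extra parentheses force the integer division):

void setup() {
  Serial.begin(9600);
  unsigned long time1 = 400713;
  unsigned long time2 = 407340;

  // cast binds to the difference first, so the division is float/int
  Serial.println((float)(time2 - time1) / 1000);    // prints 6.63

  // extra parentheses force integer division first, then the cast
  Serial.println((float)((time2 - time1) / 1000));  // prints 6.00
}

void loop() { }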

Ok, so more testing. I have removed the interrupt code. As a test similar to what CodingBadly posted, I placed this in my main:

time1 = millis();
delay(3200);
time2 = millis();
Serial.print((float)(time2 - time1) / 1000);

This call returns 3.20, as I would expect.

Next, I took my code and rewrote my main as

void loop() {

  // read the analog in value:
  checkChange();
  ampValue = analogRead(analogamp) / 50 + 10;

  // MOTOR RUNNING CODE GOES HERE.  ABOVE LOOP ONLY RUNS ON CHANGE AND SWITCHES SPEEDS

  time1 = millis();

  for (i = 0; i < steps; i++) {
    tc_interval[i] = (int)(t_interval[i] * outputValue); // outputValue used to be time
  }
  // running = true;
  // while(running){

  digitalWrite(directionPin, HIGH);

  for (j = 0; j < steps; j++) {
    t = tc_interval[j];
    for (k = 0; k < ampValue; k++) {  // k multiplier for amplitude
      digitalWrite(stepPin, HIGH);
      delayMicroseconds(t);
      digitalWrite(stepPin, LOW);
      delayMicroseconds(t);
    }
  }

  checkChange();

  digitalWrite(directionPin, LOW);

  for (j = 0; j < steps; j++) {
    t = tc_interval[j];
    for (k = 0; k < ampValue; k++) {  // k multiplier for amplitude
      digitalWrite(stepPin, HIGH);
      delayMicroseconds(t);
      digitalWrite(stepPin, LOW);
      delayMicroseconds(t);
    }
  }

  time2 = millis();
  Serial.print((float)(time2 - time1) / 1000);
  // }

  // Serial.print( 13, BYTE);
  // Serial.print("time 1 :");
  // Serial.print(time1);
  // Serial.print(" time 2:");
  // Serial.print(time2);
  // Serial.print("Per =");
  // Serial.print((float)(time2-time1)/1000);
  // Serial.print(" s A: ");
  // Serial.print(ampValue);

} // end main loop

To make it even simpler, I disconnected my motor driver and simply plugged an LED into what I am calling my directionPin, so this pin flashes on and off with the same period the motor driver would see. The time displayed by Serial.print((float)(time2-time1)/1000) is 6.62. Yet if I take a stopwatch and start it when the LED turns on, then stop it when it turns on the second time, which should be one entire period, the stopwatch shows a time of 8.4 seconds. Now, I expect that there is some delay in my hand-eye coordination, but there is no way I am off by almost two whole seconds in my stopwatch measurement.

Appreciate all the help.

B

Your Sketch calls these functions...

checkChange
analogRead
millis
digitalWrite
delayMicroseconds

Lots of people (including me) use these functions without any mystery clock problems...

analogRead
millis
digitalWrite
delayMicroseconds

That leaves...

checkChange

Which isn't a member of the Arduino libraries. For further help, please post the entire Sketch, including checkChange.

Ok, I can remove checkChange() with no big issue; all it does is perform an analog read (a simplified stand-in is below). As further testing I ran the following script; perhaps someone else could run it and post the results.
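For reference, it amounts to something like this (a simplified stand-in with an illustrative lastReading global, not my exact code):

int lastReading = -1;  // illustrative global

void checkChange() {
  int reading = analogRead(analogamp);  // analogamp as declared in the sketch
  if (reading != lastReading) {         // only act when the pot has moved
    outputValue = reading;              // rescales tc_interval in the main loop
    lastReading = reading;
  }
}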

Per the documentation, the largest accurate delayMicroseconds() value is 16383, so I ran the following code.

time1 = millis();
delay(1200);
time2 = millis();
Serial.print("delay 1200 mili = ");
Serial.println(time2 - time1);

time1 = millis();
delayMicroseconds(16383);
time2 = millis();
Serial.print("delay micro 16383 = ");
Serial.println(time2 - time1);

time1 = millis();
delay(16);
time2 = millis();
Serial.print("delay 16 milli = ");
Serial.println(time2 - time1);

time1 = millis();
for (k = 0; k < 1000; k++) {
  delayMicroseconds(10000);
}
time2 = millis();
Serial.print("delayMicro loop = ");
Serial.println(time2 - time1);

In the first section I delay(1200) and then print out the ms difference between time1 and time2. The output to the serial monitor is

delay 1200 mili = 1201

Now I run the second section, which uses the largest accurate delayMicroseconds value, 16383. The output on the serial monitor is

delay micro 16383 = 1

16383 us should be about 16 ms.

To test the regular delay, the output from delay(16) is
delay 16 milli = 17

about what it should be.

When I run the loop over delayMicroseconds(10000), the time printed to the serial monitor is

delayMicro loop = 1049

Yet the delay between the serial output of the previous entry and this one, as measured with my stopwatch, is about 10 seconds, which is what it should be: 10 ms * 1000 ≈ 10 s.

It appears to me as though the timer that drives millis() is being stopped while I execute delayMicroseconds().

Can someone else execute the above code and see what the reported values are?

delay 1200 mili = 1201

delay micro 16383 = 1

delay 16 milli = 17

delayMicro loop = 1049

Thanks,
Brian

Ah! Now it makes sense.

Two important things from here...
http://www.arduino.cc/en/Reference/DelayMicroseconds

For delays longer than a few thousand microseconds, you should use delay() instead.

To ensure more accurate delays, this function disables interrupts during its operation, meaning that some things (like receiving serial data, or incrementing the value returned by millis()) will not happen during the delay. Thus, you should only use this function for short delays, and use delay() for longer ones.

Anything over 1000 (1 ms) is probably going to cause problems. Anything over 2000 (2 ms) is definitely going to cause problems. It also explains your delayMicro loop = 1049: with interrupts off, the Timer0 overflow flag can only remember one missed tick, so each delayMicroseconds(10000) call advances millis() by only a millisecond or so, and a thousand calls add up to roughly a second instead of ten.

You have two choices...

  • Make your own delayMicroseconds that does not disable interrupts
  • Use delay for anything over 1000 us (see the sketch below)
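For the second option, something along these lines keeps millis() ticking (a minimal sketch; the name longDelayMicroseconds is just for illustration):

// Split a long microsecond delay into a millisecond part handled by
// delay(), which leaves interrupts enabled so millis() keeps counting,
// plus a short sub-millisecond remainder for delayMicroseconds().
void longDelayMicroseconds(unsigned long us) {
  delay(us / 1000);               // whole milliseconds, interrupt-friendly
  delayMicroseconds(us % 1000);   // remainder is always < 1000 us
}

The inner stepping loops could then call longDelayMicroseconds(t) instead of delayMicroseconds(t) whenever t can exceed 1000.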

The annoying part is that although delayMicroseconds blocks interrupts in its delay loop, it does not really achieve the aimed-for precision. The reason is that once the microseconds have elapsed, it re-enables interrupts before it returns. Since all pending asynchronous interrupts will then trigger immediately, the actual delay may be much longer than intended.

Bottom line: delayMicroseconds can screw you up unexpectedly.

Cheers, Udo

Thanks guys,

looks like it's time to restructure some code.

Brian