faster micros()?

hi everyone,

I've got some code running a stepper motor (throwing a pin HIGH/LOW at equal intervals) and I'm using micros() to check times. I'm concerned about keeping the code as fast as possible and was wondering if there is a faster way to access a timer counter than just micros(). I know there is a timer interrupt library floating around out there, but I wonder if it offers any speed increase.

I'm currently doing something like this:

unsigned long previousTime = 0;
unsigned long delayTime;

void setup() {
  delayTime = 100; // time to wait (in microseconds)
}

void loop() {
  unsigned long time = micros();

  if (time >= previousTime + delayTime) {
    // do something because our proper wait interval has passed
    previousTime = time;
  }
}

It should also be noted that, in order to implement acceleration, delayTime will vary, so I'm not certain interrupts are the way to go...
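
Roughly, what I have in mind for the acceleration is something like this (made-up numbers, just to show delayTime changing between steps):

unsigned long delayTime = 1000;          // start slow: 1000 microseconds between steps
const unsigned long minDelayTime = 100;  // top speed: 100 microseconds between steps

void takeStep() {
  // ...pulse the step pin HIGH then LOW here...
  if (delayTime > minDelayTime) {
    delayTime -= 5;                      // crude linear ramp, for illustration only
  }
}
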

You can look at the source of micros() and inline it into your code if you want; that avoids a function call.
Also, you might only need 16-bit resolution rather than 32, which will again increase the speed.

It should be in /hardware/arduino/core/arduino/wiring.c
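
On the 16-bit point, this is roughly what I mean (my sketch, only valid if every interval stays well under the ~65 ms rollover of a 16-bit microsecond count):

// Truncate micros() to 16 bits and let unsigned wraparound handle rollover.
uint16_t previousTime = 0;
uint16_t delayTime = 100;   // microseconds

void setup() {
}

void loop() {
  uint16_t now = (uint16_t)micros();
  if ((uint16_t)(now - previousTime) >= delayTime) {
    // step the motor here
    previousTime = now;
  }
}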

I do not recommend inlining micros(), because I coded it for robustness rather than speed. Inlining it will not gain you much.

Instead, if you are guaranteed that you will always be measuring strictly less than a millisecond, you can read the SysTick counter directly. The counter counts down from 83999 to 0 with each clock tick. You use it a bit like micros(), except that you have to handle the wraparound explicitly, and because it counts down the subtraction is the other way around. You could divide the result by 84 to get microseconds, but if you are after a specific delay it would be better to multiply the delay you require by 84 to get it in clock ticks.

Here is an example:

// SysTick example by stimmer

void setup() {
  Serial.begin(115200);
}

void loop() {
  
  int r = random(200);        // pick a random delay to measure
  int v = SysTick->VAL;       // read the down-counter before the delay
  delayMicroseconds(r);
  v = v - SysTick->VAL;       // down-counter, so subtract the other way around
  if (v < 0) v += 84000;      // handle wraparound past 0
  Serial.print("delayMicroseconds(");
  Serial.print(r);
  Serial.print(") took ");
  Serial.print(v);
  Serial.print(" clock ticks, which when divided by 84 equals ");
  Serial.println(v/84);
  delay(500);
  
}

Output:

delayMicroseconds(169) took 14205 clock ticks, which when divided by 84 equals 169
delayMicroseconds(132) took 11097 clock ticks, which when divided by 84 equals 132
delayMicroseconds(117) took 9837 clock ticks, which when divided by 84 equals 117
delayMicroseconds(47) took 3957 clock ticks, which when divided by 84 equals 47
delayMicroseconds(172) took 14457 clock ticks, which when divided by 84 equals 172
delayMicroseconds(68) took 5721 clock ticks, which when divided by 84 equals 68
delayMicroseconds(180) took 15129 clock ticks, which when divided by 84 equals 180
delayMicroseconds(23) took 1941 clock ticks, which when divided by 84 equals 23
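
And on that last point, a rough sketch of waiting for a specific delay in clock ticks rather than microseconds (the function name is just for illustration, and it is only valid for waits well under one millisecond):

// Busy-wait a given number of microseconds by polling SysTick directly.
void waitShort(uint32_t us) {
  int32_t target = us * 84;                    // convert microseconds to clock ticks
  int32_t start = SysTick->VAL;                // counter counts DOWN from 83999 to 0
  int32_t elapsed;
  do {
    elapsed = start - (int32_t)SysTick->VAL;   // down-counter, so subtract the other way
    if (elapsed < 0) elapsed += 84000;         // handle the wrap from 0 back to 83999
  } while (elapsed < target);
}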

stimmer:
I do not recommend inlining micros(), because I coded it for robustness rather than speed. Inlining it will not gain you much.

Instead, if you are guaranteed that you will always be measuring strictly less than a millisecond, you can read the SysTick counter directly. The counter counts down from 83999 to 0 with each clock tick. You use it a bit like micros(), except that you have to handle the wraparound explicitly, and because it counts down the subtraction is the other way around. You could divide the result by 84 to get microseconds, but if you are after a specific delay it would be better to multiply the delay you require by 84 to get it in clock ticks.

Thanks for the advice, stimmer; that gave me a lot to think about.

Taking your cues, I've written something like this:

time = GetTickCount() * 1000 + (84000 - SysTick->VAL) / 84;

in order to get elapsed time instead of using micros().

Now, I understand it's very close to the code for micros(), but it's slimmed down.

I'm not sure whether this offers any speed savings, though; a naive print statement at 115200 baud to check times shows the same time required for both ways of checking time. I wonder if this should be checked with a scope instead. Thoughts?
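
Before reaching for a scope, I suppose the same SysTick trick from above could time each read in clock ticks (84 per microsecond). Something like this (my sketch):

void setup() {
  Serial.begin(115200);
}

void loop() {
  // time a call to micros() in SysTick ticks
  int v = SysTick->VAL;
  volatile uint32_t t = micros();      // volatile so the call isn't optimized away
  v = v - SysTick->VAL;
  if (v < 0) v += 84000;
  Serial.print("micros() took ");
  Serial.print(v);
  Serial.println(" ticks");

  // time the direct GetTickCount()/SysTick read the same way
  v = SysTick->VAL;
  t = GetTickCount() * 1000 + (84000 - SysTick->VAL) / 84;
  v = v - SysTick->VAL;
  if (v < 0) v += 84000;
  Serial.print("direct read took ");
  Serial.print(v);
  Serial.println(" ticks");

  delay(500);
}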

That won't always work - there's a small chance of the tick overflow happening between GetTickCount() and the read of SysTick->VAL, making the result off by 1000. (There's a whole thread on this somewhere.)

A call to micros() should take less than a microsecond.
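
If you want to keep the direct read, one way to guard against that race is to re-read GetTickCount() and retry if it changed while SysTick was being sampled. A simplified sketch (I believe the stock micros() handles this more carefully and also checks the SysTick pending flag):

uint32_t elapsedMicros() {
  uint32_t ms, ticks;
  do {
    ms = GetTickCount();              // milliseconds since start
    ticks = SysTick->VAL;             // down-counter, 83999..0
  } while (ms != GetTickCount());     // a millisecond tick slipped in between; sample again
  return ms * 1000 + (84000 - ticks) / 84;
}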

Can one set the SysTick value with something like
SysTick->VAL = 0;
to synchronize the RTC second tick to a millisecond and microsecond timer?

I want to be able to read the RTC and a coordinated millisecond count, to get the date and a properly coordinated hh:mm:ss:mmm.

I was going to use a timer that counts milliseconds, resetting the timer to zero whenever I get a seconds-changed interrupt from the RTC... assuming that can be done on the Due.

Yes, I know that the RTC is reset with each reset and each power off/on. Still, in logging data I need second and msec time stamps, and some date and hh:mm.

Although awkward, the user could set the RTC after power up, if needed.
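
What I had in mind is roughly this (rtcSecondInterrupt() is just a placeholder for whatever seconds-changed interrupt handler I end up attaching, not a real API name):

volatile uint32_t millisAtSecond = 0;

void rtcSecondInterrupt() {            // attach to the RTC seconds-changed interrupt
  millisAtSecond = millis();           // capture millis() at the second boundary
}

uint32_t msIntoCurrentSecond() {
  return millis() - millisAtSecond;    // 0..999 between RTC second ticks
}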