Better to use timer interrupt or millis()?

Hello,

I'm building a robot and need to measure the time between each pulse of the encoder. Right now I'm using a timer interrupt with the DueTimer library. Is there an advantage to using a timer interrupt over the built-in millis() function?

Using millis() is generally simpler, but whether it is suitable depends on the length of time you need to measure. I would not try to measure anything shorter than 50 milliseconds with millis(). Below that I would use micros(). However, micros() increments in steps of 4 microseconds on the 16 MHz boards, so if you need finer resolution it is not suitable.

If you need more help please provide a full description of the project you are creating and post your program.

...R

Doesn't it matter how accurate the time needs to be, and what other processing is occurring that may delay any measurement?

If you're using an interrupt to detect encoder events, it's not difficult to capture a timestamp for those events, which can then be processed a bit later.

If you're polling for encoder events, it doesn't sound like accuracy is very important.

Better to use timer interrupt or millis()?

Neither; you need to use an input interrupt and micros(). An input interrupt triggers when a pin changes state, and micros() returns a reasonably accurate microsecond timestamp. If you subtract the previous value from the current one, you'll get the time between pulses accurate to a few microseconds.

Look up attachInterrupt().
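
A minimal sketch of that approach (pin 2 and the baud rate are assumptions — check which pins support external interrupts on your board):

```cpp
const byte encoderPin = 2;              // assumed interrupt-capable pin

volatile unsigned long lastPulse = 0;   // micros() at the previous pulse
volatile unsigned long interval = 0;    // time between the last two pulses
volatile bool newPulse = false;

void pulseISR() {
  unsigned long now = micros();
  interval = now - lastPulse;           // unsigned subtraction handles rollover
  lastPulse = now;
  newPulse = true;
}

void setup() {
  Serial.begin(115200);
  pinMode(encoderPin, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(encoderPin), pulseISR, RISING);
}

void loop() {
  if (newPulse) {
    noInterrupts();                     // copy shared data atomically
    unsigned long t = interval;
    newPulse = false;
    interrupts();
    Serial.println(t);                  // microseconds between pulses
  }
}
```

Note that interval is shared with the ISR, which is why it's declared volatile and copied with interrupts briefly disabled before printing.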