I'd like to be able to measure how long an ISR takes to execute.
I'm using two quadrature encoders on two 12 V DC motors. The encoders are 64 counts per motor revolution, and with a 29:1 gearbox that's 1,856 counts per wheel rotation.
In my setup the wheels can spin at about 3 rotations per second, so that's 1,856 × 3 (rps) × 2 (wheels) = 11,136 interrupts per second maximum. Let's say 10,000/s.
I wrote the ISRs to be as short as possible; they basically do only one thing (check the state of two pins).
However, I'm concerned that even if they are short, they will eat up processor time if they are called every 0.1 ms.
My main loop needs to run at a fixed period (10 ms).
The way I understand ISRs, if I could check millis() right before and right after an ISR, it would show the same value, even if the ISR took 500 ms to run. Is this correct? Or does the counter "catch up" after the ISR exits and show the correct millis() again?
Assuming my understanding above is correct: if I want my main loop to run every 10 ms, but interrupts take, say, 10 ms every loop, in effect the loop will take 20 ms (even though the processor thinks it's been 10 ms). I have no idea how I could avoid this. And similarly,
if ISRs indeed pause the update of millis(), then my main loop time will depend on how many interrupts fired during one loop cycle, that is, on wheel speed. How can I avoid that?
Is it possible to measure the time taken by an ISR? (Separate question, just so I can optimise my ISR code as much as possible.)
Maybe I'm all wrong and inventing a problem, but I just want to make sure I'm not doing anything dumb. Thanks for your help.