Maximising uno timer resolution.

I'm trying to work out how to use the Arduino Uno to time things at its maximum frequency (16 MHz), i.e. measure to the nearest 62.5 ns. I've done a bit of surface-level research and cobbled together the following code:

#include <avr/interrupt.h> 
#include <avr/io.h>

unsigned long count = 0;   //used to keep count of how many interrupts were fired

//Timer2 Overflow Interrupt Vector, intended to fire every 62.5 ns
ISR(TIMER2_OVF_vect) {
  count++;               //Increments the interrupt counter
  TCNT2 = 255;           //Reset Timer to 255 out of 255
  TIFR2 = 0x00;          //Timer2 INT Flag Reg: Clear Timer Overflow Flag
}

void setup() {
  noInterrupts();
  
  //Setup Timer2 to fire every 62.5ns
  TCCR2B = 0x00;        //Disable Timer2 while we set it up
  TCNT2  = 255;         //Reset Timer Count to 255 out of 255
  TIFR2  = 0x00;        //Timer2 INT Flag Reg: Clear Timer Overflow Flag
  TIMSK2 = 0x01;        //Timer2 INT Reg: Timer2 Overflow Interrupt Enable
  TCCR2A = 0x00;        //Timer2 Control Reg A: Wave Gen Mode normal
  TCCR2B = 0x01;        //Timer2 Control Reg B: Timer Prescaler set to 1
  interrupts();
}

void loop() {
}

Note that I copied and pasted some code and played around with the parameters. I guessed that TCCR2B = 0x01 sets the prescaler to 1. For the most part, I only know approximately what the code does: the Arduino's Timer2 increments the value in an 8-bit register, and when this overflows an interrupt is fired. Setting the prescaler to 1 and the register value to 255 (leaving only one increment until overflow) should, in theory, make the register overflow every 62.5 ns.

I used this code to flash an LED, and determined that I had a resolution of around 500 ns; an order of magnitude worse than my target! Is my code incorrect? Is it just less efficient than it could be? Or am I flogging a dead horse; am I ever going to get down to 62.5 ns?

My interest in this is purely academic, with a potential end goal of attempting to measure the speed of light. Whatever the case, I want to time as precisely as possible!

Thank you for your help :)

JizzaDaMan:
Is my code incorrect? Is it just less efficient than it could be? Or am I flogging a dead horse; am I ever going to get down to 62.5ns?

Yes. Yes. Yes. Yes.

This is a better place to start...

Thanks :)

If you want to time things at 16,000,000 ticks per second, don't use interrupts! They take about 80 cycles (5 microseconds) just to get started, and every instruction in the handler costs at least one more tick.

Use the Input Capture feature of Timer1. It watches for an edge on the ICP1 pin (pin 8) and saves the current value of Timer1 in the ICR1 register. Use the input capture interrupt (which fires AFTER the value is safely latched) to copy ICR1 into a variable; you can then compare it against a future capture. This gives you intervals with 62.5 ns resolution, assuming Timer1 runs with a prescaler of 1. The 16-bit timer overflows at about 244 Hz, so for any interval longer than a few milliseconds you will also want to count timer overflows.
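A minimal sketch of that approach (assuming an ATmega328P-based Uno; the variable names and edge choice are illustrative, not anything from the posts above): feed the signal to pin 8 (ICP1), run Timer1 at the full 16 MHz, and combine ICR1 with an overflow count into a 32-bit timestamp:

```cpp
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t overflowCount = 0;   // Timer1 wraps every 65536 ticks (~4.1 ms)
volatile uint32_t lastStamp = 0;       // 32-bit timestamp of the previous edge
volatile uint32_t intervalTicks = 0;   // ticks (62.5 ns each) between last two edges
volatile bool haveInterval = false;

ISR(TIMER1_OVF_vect) {
  overflowCount++;
}

ISR(TIMER1_CAPT_vect) {
  uint16_t capture = ICR1;             // value latched by hardware at the edge
  uint16_t ovf = overflowCount;
  // If an overflow is pending and the capture happened just after the wrap,
  // the stored overflow count is one short for this timestamp.
  if ((TIFR1 & _BV(TOV1)) && capture < 0x8000) {
    ovf++;
  }
  uint32_t stamp = ((uint32_t)ovf << 16) | capture;
  intervalTicks = stamp - lastStamp;
  lastStamp = stamp;
  haveInterval = true;
}

void setup() {
  Serial.begin(115200);
  noInterrupts();
  TCCR1A = 0;                          // normal mode, no PWM
  TCCR1B = _BV(ICES1) | _BV(CS10);     // capture on rising edge, prescaler = 1
  TCNT1  = 0;
  TIFR1  = _BV(ICF1) | _BV(TOV1);      // clear stale flags (write 1 to clear)
  TIMSK1 = _BV(ICIE1) | _BV(TOIE1);    // enable capture + overflow interrupts
  interrupts();
}

void loop() {
  if (haveInterval) {
    noInterrupts();                    // read the 32-bit value atomically
    uint32_t t = intervalTicks;
    haveInterval = false;
    interrupts();
    Serial.print(t);
    Serial.println(" ticks of 62.5 ns");
  }
}
```

The check on TIFR1/TOV1 in the capture ISR handles the race where the timer wraps between the edge and the interrupt being serviced; without it, intervals that straddle an overflow can come out 65536 ticks short.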