CDI tester project

Here is the bug in the program that required the delay(2000) in setup().

int timerTopValue = 0;

When the first TIMER1_COMPA_vect interrupt fired, OCR1A was set to timerTopValue, which was still 0 because the interrupt occurred before the first pot reading had set the period. With a top value of 0, the compare match fires on every timer tick, so the subsequent interrupts came so fast that I think the millis() timekeeping was starved and the code never reached the section where the pot is read. I'm not quite sure why the delay in setup fixed this.

There are two solutions. The simple one is to initialize timerTopValue with the 12500 count used in the initial setup, and change the line in the timer setup to reflect the non-zero initial assignment.

// int timerTopValue = 0;
int timerTopValue = 12500;  // 50 ms with 64 prescaler (changed from timerTopValue = 0)

// OCR1A = 12500;  // 50 ms to top value
OCR1A = timerTopValue;

The other would be to call analogRead() in setup(), assign the mapped value to RPM, and calculate an initial timerTopValue at that time. With either of these changes, the sketch no longer needs the delay in setup().