Hi guys.
I have a sketch which acts as a high-accuracy pulse delay unit, running on a Nano ATmega328P (old bootloader). I am using Gabriel Staples' high-accuracy Timer2 library, a concise and really useful piece of coding, which claims a resolution of 0.5us per count.
The delay is hard-coded in, and when triggered the sketch sets up a target count beyond the start point. According to the Timer2 library that target should simply be [Desired Time (us) * 2], and I reckon there will also need to be a small fixed correction for the other bits of processing performed outside the loop, which should be pretty much constant and not dependent on the target delay at all.
The input and output pulses are buffered by simple single-transistor amps: pulldown in the case of the input (FALLING), and pullup for the output, since it will eventually have to drive optoisolated equipment and the current needs of an LED/resistor input are better served from a PNP collector than through the Rload of an NPN.

The sketch is triggered by an interrupt on an I/O pin. It calculates the final target count from the start count and the desired delay, then starts looping, reading the count over and over and doing nothing else at all until the target count is exceeded. I anticipated that each iteration of the loop would take a fixed number of clock cycles, so each read would carry a pretty much constant overhead. When the target count is exceeded it fires an output pulse (FALLING) to the output transistor and reports back on the diagnostic serial line with a final count total, so I have an idea of how close to the desired delay I am.
This is working perfectly, and it reports back various stats I have programmed in, including the average error beyond the target count as the pulse count builds up (I used 100 tests). All of the reporting is outside the delay and pulse-firing code. It reports an average error of around 2.7us for a delay of 1000ms, with the error varying at random between 0us min and 9us max.
Now the oddity. I am trying to check this against my scope (a GW Instek GDS1052U, basic but workmanlike), and it says the 1000ms pulse delay is out by about 300us. I need this to be reliably better than 50us, which the internal stats suggested it was easily beating. I am loath to believe that sort of delay is inherent in a couple of simple single-transistor stages, so I'm looking for other reasons for the discrepancy. The scope could be one of them, of course, but I would have thought that a digital scope of this type would be pretty much on the beam with this sort of measurement.
Does anyone out there have an idea of how close to exactly 0.5us per count the high-accuracy Timer2 library actually achieves?