I've been working on a couple of standalone clock projects, and I'm almost ready to wrap them up.
I calibrated them by running them for a known amount of time, measuring how much time they lose, and scaling the loop delay by a calibration coefficient of sorts.
For example, after 30811 real seconds my clock showed 30729 seconds, so I scaled the loop delay by 30729/30811 = 0.99733...:
delay(99);               // calibrated delay: 99 ms
delayMicroseconds(733);  // plus 733 us, for 99,733 us total (originally a 100 ms delay)
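To spell out the same arithmetic in code (a minimal sketch; the constant names are mine, and the integer math is just one way to avoid floating-point rounding):

const unsigned long BASE_DELAY_US = 100000UL;  // original 100 ms loop delay
const unsigned long CLOCK_SECONDS = 30729UL;   // seconds my clock showed
const unsigned long REAL_SECONDS  = 30811UL;   // seconds that actually elapsed

// 100000 * 30729 / 30811 = 99733 us (product still fits in a 32-bit unsigned long)
unsigned long calibratedDelayUs = (BASE_DELAY_US * CLOCK_SECONDS) / REAL_SECONDS;

delay(calibratedDelayUs / 1000);              // whole milliseconds: 99 ms
delayMicroseconds(calibratedDelayUs % 1000);  // remaining microseconds: 733 us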
For learning purposes, I would like to know how much time each instruction takes.
Is there a way to calculate how much time each instruction takes per program cycle?
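To illustrate the kind of thing I mean, the closest I can picture is measuring it empirically rather than calculating it: bracket a repeated operation with micros() and divide by the repeat count. A minimal sketch (the repeat count and the incremented variable are just placeholders, and the result includes loop overhead, so it's only an average per iteration, not true per-instruction timing):

void setup() {
  Serial.begin(9600);

  volatile long x = 0;                    // volatile so the compiler can't optimize the loop away
  const unsigned long REPS = 100000UL;    // repeat count (placeholder value)

  unsigned long start = micros();
  for (unsigned long i = 0; i < REPS; i++) {
    x = x + 1;                            // the operation being timed (placeholder)
  }
  unsigned long elapsed = micros() - start;

  Serial.print("Average time per iteration (us): ");
  Serial.println((float)elapsed / REPS);  // average only; includes loop overhead
}

void loop() {}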