Program scan time calculation

Hello everyone.

I've been working on a couple of standalone clock projects, and I'm almost ready to wrap them up.

I calibrated them by running them for a known amount of time, measuring how much time they lose, and scaling the loop delay by a calibration coefficient of sorts.

For example, after 30811 real seconds my clock showed 30729 seconds, so I scaled the loop delay by 30729/30811 = 0.99733...

delay(99);                 // calibrated delay
delayMicroseconds(733);    // 99,733 us total... originally 100 ms
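
To show the idea, here is a minimal sketch of a calibrated one-second tick. The ten-ticks-per-second counting and the serial output are just for illustration, not my exact program:

const unsigned long CALIBRATED_TICK_US = 99733UL;  // ~ 100000 us * 30729 / 30811

unsigned long seconds = 0;        // elapsed seconds shown by the clock
byte ticks = 0;                   // ten ~100 ms ticks make one second

void setup() {
  Serial.begin(9600);
}

void loop() {
  delay(CALIBRATED_TICK_US / 1000);                // 99 ms
  delayMicroseconds(CALIBRATED_TICK_US % 1000);    // 733 us
  if (++ticks == 10) {                             // one second has passed
    ticks = 0;
    seconds++;
    Serial.println(seconds);                       // the real display update would go here
  }
}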

For learning purposes, I would like to know how much time each instruction takes.

Is there a way to calculate the amount of time each instruction takes per program cycle?

Is there a way to calculate the amount of time each instruction takes per program cycle?

Yes. With a 16 MHz crystal, one clock cycle takes 1 / 16,000,000 s = 62.5 nanoseconds. Most AVR instructions execute in a single cycle, though some (branches, loads/stores, calls) take two or more.

Now all you need to do is figure out which machine instructions the compiler generates for each C statement, and you will know. That is not a trivial thing to do.
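
A more practical approach than counting cycles by hand is to time a block of code with micros() and divide by the number of repetitions. A rough sketch (the loop body here is just a placeholder for whatever you want to measure):

void setup() {
  Serial.begin(9600);

  volatile long x = 0;                  // volatile so the compiler can't optimize the work away
  unsigned long start = micros();
  for (unsigned int i = 0; i < 10000; i++) {
    x += i;                             // <-- the code you want to time goes here
  }
  unsigned long elapsed = micros() - start;

  Serial.print("total us: ");
  Serial.println(elapsed);
  Serial.print("us per pass: ");
  Serial.println(elapsed / 10000.0);    // micros() resolution is ~4 us on a 16 MHz board
}

void loop() {
}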

The reason for the drift is that the crystal/resonator is not all that accurate to begin with, and there is no compensation for temperature or humidity.

Don't forget to take temperature and humidity into account in your clock.