Millis Accuracy Again

There seems to be something terribly wrong with my timekeeping.

I converted the code to use micros(): everywhere I previously called millis(), I now use micros() / 1000.
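
For reference, a minimal sketch of the substitution, with one pitfall spelled out: micros() wraps at 2^32 microseconds (about 71.6 minutes), so micros() / 1000 jumps from 4294967 back to 0 at the wrap rather than rolling over at the unsigned long boundary. Interval math should subtract raw micros() values first (unsigned subtraction is wrap-safe) and only then convert:

    unsigned long lastUs = 0;

    void setup() { }

    void loop() {
        unsigned long nowUs = micros();
        // Wrap-safe: subtract in microseconds, then compare.
        if (nowUs - lastUs >= 1000000UL) {
            lastUs = nowUs;
            // ... once-per-second work here ...
        }
    }

A 23-minute run never crosses a micros() wrap, so this is not the cause of the drift below, but it will bite any run longer than about 71 minutes.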

I used RealTerm to capture the serial output and timestamp it.
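
The method, roughly: the sketch prints its own clock once per second, RealTerm prepends the PC time to each received line, and the drift is how far the two columns diverge over the run. A minimal sender along these lines (the baud rate and report interval are illustrative, not necessarily what I ran):

    unsigned long lastUs = 0;

    void setup() {
        Serial.begin(115200);  // illustrative baud rate
    }

    void loop() {
        unsigned long nowUs = micros();
        if (nowUs - lastUs >= 1000000UL) {
            lastUs += 1000000UL;             // keep the schedule, don't slip
            Serial.println(nowUs / 1000UL);  // board time in ms
        }
    }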

Over a 1408-second period (roughly 23 minutes) I have a time deviation of +/- 60 seconds.

This is with the Maxkit32, which has a crystal.

That is a huge time variation: 60 seconds in 1408 is roughly 4%, or about 43,000 ppm, where even a cheap crystal should hold within +/- 100 ppm. And the Maxkit32 uses the PIC32 core timer, which is supposed to be more accurate than the ISR-based timer in the ATmega2560.
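
As a cross-check on the library layer, the core timer can be read directly. On the PIC32 the CP0 Count register ticks at half the system clock (40 MHz on an 80 MHz Max32), and I believe the chipKIT core exposes it as ReadCoreTimer(). A hedged sketch of that cross-check; verify the tick rate against your board's variant files:

    // Derive milliseconds straight from the core timer, bypassing the
    // millis()/micros() wrappers. Assumes ReadCoreTimer() from the
    // chipKIT core and a core timer at F_CPU / 2 (40 MHz at 80 MHz).
    #define CORE_TICKS_PER_MS (F_CPU / 2 / 1000)  // 40,000 at 80 MHz

    unsigned long coreMillis() {
        // The 32-bit core timer wraps every ~107 s at 40 MHz, so this
        // only works for short intervals: a cross-check, not a clock.
        return ReadCoreTimer() / CORE_TICKS_PER_MS;
    }

If coreMillis() tracks the PC clock over a couple of minutes while micros() does not, the problem is in the wrapper; if both drift the same way, look at the oscillator or the serial path instead.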