What I basically want to measure is the phase between two sawtooth-shaped signals (see below):

1. the first signal - with discrete values 0..9 (rtc_sec MOD 10),

2. the second one - a "linear" 0.000..9.999 ((mil_sec MOD 10000) / 1000).

My working assumption about the process is the following:

Let us have stable RTC and millis signals. If we sample both signals at ANY time (obtaining the values rtc_sec and mil_sec), the difference (rtc_sec - mil_sec) should be constant. To "normalize" both signals into the same range (rtc_sec runs 0..59), we apply the MOD operations above. If we then feed a large number of these differences (taken after the MOD) through a moving average, the result should be constant, even though the first signal takes only discrete values.
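A minimal sketch of that assumption (function name and the wrap into [-5, +5) are my additions; the wrap handles the circular nature of the MOD'd signals, so that samples taken just before and just after a rollover do not produce differences near ±10):

```python
def phase_diff(rtc_sec, mil_sec):
    """Difference of the two normalized sawtooth signals.

    rtc_sec: RTC seconds (integer, 0..59)
    mil_sec: milliseconds counter (integer)
    Returns the difference wrapped into [-5.0, +5.0) seconds,
    since both normalized signals are periodic with period 10 s.
    """
    a = rtc_sec % 10                  # discrete 0..9
    b = (mil_sec % 10000) / 1000.0    # "linear" 0.000..9.999
    d = a - b
    return (d + 5.0) % 10.0 - 5.0     # wrap to [-5, +5)

# Example: the two clocks differ by a fixed 0.25 s offset;
# sampling at two arbitrary times gives the same (wrapped) difference.
print(phase_diff(7, 6750))    # 7 - 6.750 = 0.25
print(phase_diff(32, 31750))  # 2 - 1.750 = 0.25
```

The wrap step matters near the rollover: with rtc_sec = 0 and mil_sec = 9900 the raw difference is -9.9, but the wrapped value is the expected +0.1.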

The red marks are example sampling times. What I do is simply take the difference between the two sampled values and feed it through a moving average filter with a "time constant" of about ~400 sampling periods. I do not fully understand the results yet, however (and the algorithm is just my first guess).
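One way to realize such a filter is an exponential (first-order IIR) moving average; this is my assumption, the original may well use a windowed average instead. With alpha = 1/N the time constant is roughly N sampling periods, so ~400 periods corresponds to alpha = 1/400:

```python
class MovingAverage:
    """Exponential moving average (first-order IIR low-pass).

    With alpha = 1/N the filter's time constant is roughly
    N sampling periods.
    """
    def __init__(self, alpha=1.0 / 400.0):
        self.alpha = alpha
        self.value = None

    def update(self, x):
        if self.value is None:
            self.value = x    # seed with the first sample
        else:
            self.value += self.alpha * (x - self.value)
        return self.value

# Feeding a constant phase difference converges to that constant.
avg = MovingAverage()
for _ in range(2000):
    avg.update(0.25)
print(avg.value)
```

One caveat with this scheme: if the true phase sits near the wrap boundary, the individual differences jump between values near +5 and -5, and a plain average of them is meaningless; that alone could explain confusing results.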

PS: corrected MOD 10, and MOD 10000 (for 0 .. 9 and 0.000 .. 9.999).