I am working on a clock movement powered by a stepper motor and a GSM2 driver board. The delay between steps needed to achieve 1 rpm on the second hand works out to 25 ms. In my current setup, I send pulses to the driver board using the delay() function. This works reasonably well, but it is not accurate enough for me. To remedy this, I ordered a DS3231 RTC module, thinking I could use it to replace delay()/millis(). However, I can't figure out how to achieve millisecond resolution using the RTC. It seems like it should be straightforward, given that the oscillator on board the RTC runs at 32 kHz. I am new to Arduino programming and am out of my comfort zone with this one.
Does anyone know of a way to achieve the 25 ms pulses I need while still retaining the accuracy of the RTC? Is it possible to use the RTC to correct the output of the built-in delay()/millis() functions? And how would you then translate that output into pulses sent to the stepper motor driver board?
Internet research turns up mixed results on whether this is possible. I have yet to see an example of a clock with millisecond resolution. The theory behind it makes sense, but I don't know where to start.
When timing such short intervals (only 25 ms or so), I recommend that you use micros() rather than millis(). This is because micros() will allow you to track the intervals more precisely.
I have built several Arduino-based clocks, and your problem is similar to one I have had to solve. There are several ways to solve this problem; I will tell you my way of solving it. Other people might recommend different solutions.
For measuring very short lengths of time, the micros() timer works well; however, it might run a tiny bit too fast or too slow. After a few minutes, the inaccuracy of micros() will start to matter.
The DS3231 does an excellent job of keeping track of days, hours, minutes, and seconds, but there is no straightforward way to read fractions of a second from it. (Well, there are ways to work around this, as you can see on Page 13 of the datasheet, but I've never done it that way.)
As for your clock:
I would declare a variable (probably an unsigned long) for the number of microseconds between steps. I would initialize this variable to 25000. I would then use something similar to Blink Without Delay to make the steps happen at the chosen interval. I would also keep count of the number of steps (and, for that count, I would need another variable), and I would use that count to calculate the number of seconds that the clock hands had advanced.
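The step loop described above can be sketched like this. It is modeled here in plain C++ with the current time passed in as a parameter, so the same logic can be exercised off-board; on an Arduino you would call update(micros()) from loop() and pulse the driver when it returns true. All the names (StepScheduler, stepInterval, and so on) are mine for illustration, not code from this thread.

```cpp
#include <cstdint>

// Blink-Without-Delay-style step scheduler, with the current time
// passed in so the logic can be tested off-board.
struct StepScheduler {
    uint32_t stepInterval = 25000;   // microseconds between steps
    uint32_t lastStep = 0;           // scheduled time of the previous step
    uint32_t stepCount = 0;          // total steps taken so far

    // Returns true when it is time to take another step.
    bool update(uint32_t nowMicros) {
        if (nowMicros - lastStep >= stepInterval) {  // unsigned math handles wraparound
            lastStep += stepInterval;   // advance by the interval (not to "now"),
                                        // so the long-term rate stays exact
            ++stepCount;
            return true;
        }
        return false;
    }

    // Seconds the hands have advanced: 40 steps of 25 ms each per second.
    uint32_t handSeconds() const { return stepCount / 40; }
};
```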
Every so often (maybe once every 30 seconds or so), I would check the time according to the DS3231. If, for example, it turned out that the clock hands had advanced 30 seconds in 29 seconds real time, I would know that the clock was running too fast, and I would slow it down a bit. (I might make the interval between steps be 25500 microseconds rather than 25000.)
If it turned out that the clock hands had advanced 30 seconds in 31 seconds real time, then I would know that the clock was running too slow, and I would speed it up a bit. (I might make the interval between steps be 24500 microseconds rather than 25000.)
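The figures above are illustrative nudges; the correction can also be made proportional, since the right scale factor is simply (hand seconds / real seconds). A hypothetical helper along those lines (my name and my formula, not the poster's code):

```cpp
#include <cstdint>

// If the hands advanced handSeconds while the RTC says realSeconds of
// real time elapsed, scale the step interval to compensate.
// Hands fast (handSeconds > realSeconds) => lengthen the interval;
// hands slow => shorten it. Illustrative helper, not from the thread.
uint32_t correctedInterval(uint32_t intervalMicros,
                           uint32_t handSeconds,
                           uint32_t realSeconds) {
    // 64-bit intermediate so the multiply cannot overflow
    return (uint64_t)intervalMicros * handSeconds / realSeconds;
}
```

For the examples above: 30 hand-seconds in 29 real seconds gives 25000 × 30 / 29 ≈ 25862 µs (slowing the clock down); 30 hand-seconds in 31 real seconds gives ≈ 24193 µs (speeding it up).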
I’d try using the DS3231’s SQW output pin to provide a 1 Hz interrupt to your processor. In the ISR, note the elapsed time between interrupts as measured by the internal millis(). Depending on which processor you have, millis() may not update inside the ISR, but it will still report the correct value at the moment the interrupt occurred. Also, set a flag telling your main code that an updated value is available.
In your main code, use this value to determine how many millis() “ticks” is equal to 25ms of real time. The interrupt will give a new value for this every second in case your processor’s oscillator drifts.
EDIT:
Per @odometer's post above, use micros() for this rather than millis(). Same idea though. The RTC interrupt will give you a constantly-updated count of how many micros() "ticks" are equal to a second of real time.
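The conversion from that measurement to a step interval is a simple ratio. Assuming the ISR has recorded the number of micros() ticks between two consecutive 1 Hz edges, the main code can scale it down to 25 ms worth of ticks (the function name is illustrative):

```cpp
#include <cstdint>

// Given the measured number of micros() ticks between two 1 Hz edges
// from the DS3231's SQW pin, compute how many ticks correspond to
// 25 ms of real time. If the MCU's oscillator were perfect, the
// measurement would be exactly 1,000,000. Illustrative sketch.
uint32_t ticksPer25ms(uint32_t ticksPerRealSecond) {
    // 64-bit intermediate avoids overflow in the multiply
    return (uint64_t)ticksPerRealSecond * 25 / 1000;
}
```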
You could also average this over some time period to smooth out the adjustments (low pass filter).
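One common way to do that smoothing is an exponential moving average, which needs no sample buffer. This sketch is my own illustration of the idea, not code from the thread:

```cpp
#include <cstdint>

// Exponential moving average over successive ticks-per-second
// measurements. A WEIGHT of 8 means each new sample moves the
// estimate 1/8 of the way toward itself, smoothing per-second jitter.
// Illustrative sketch, not code from the thread.
struct TickFilter {
    static constexpr int64_t WEIGHT = 8;
    uint32_t estimate = 1000000;  // start from the nominal value

    uint32_t add(uint32_t sample) {
        int64_t diff = (int64_t)sample - (int64_t)estimate;
        estimate = (uint32_t)((int64_t)estimate + diff / WEIGHT);
        return estimate;
    }
};
```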
gfvalvo:
I’d try using the DS3231’s SQW output pin to provide a 1 Hz interrupt to your processor. In the ISR, note the elapsed time between interrupts as measured by the internal millis(). Depending on which processor you have, millis() may not update inside the ISR, but it will still report the correct value at the moment the interrupt occurred. Also, set a flag telling your main code that an updated value is available.
In your main code, use this value to determine how many millis() “ticks” is equal to 25ms of real time. The interrupt will give a new value for this every second in case your processor’s oscillator drifts.
And, for goodness' sake, do not use Serial.print inside the ISR!
If you find yourself wanting to use Serial.print inside an ISR, you should instead take the value you wish to print and store it in a variable. Then, once you're outside the ISR, it is perfectly OK to use Serial.print to print the value of that variable.
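That store-and-flag pattern can be sketched as follows. It is modeled in plain C++ here; on a real board the "ISR" would be attached with attachInterrupt(), and reading a multi-byte shared value from loop() would need interrupts briefly disabled. All the names are illustrative.

```cpp
#include <cstdint>

// The safe pattern: the ISR only stores the value and raises a flag;
// the main loop does the printing. Modeled in plain C++; names are
// illustrative.
volatile bool reportReady = false;
volatile uint32_t reportValue = 0;

void onSqwEdge(uint32_t measuredTicks) {  // stands in for the real ISR
    reportValue = measuredTicks;          // store, don't print
    reportReady = true;                   // tell the main loop a value is waiting
}

// Called repeatedly from the main loop; returns true if it consumed a
// new value (which would then be safe to Serial.print).
bool serviceReport(uint32_t &out) {
    if (!reportReady) return false;
    reportReady = false;
    out = reportValue;
    return true;
}
```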
(Is there an FAQ on exactly what is, and what isn't, OK to do inside an ISR?)
gfvalvo:
Per @odometer's post above, use micros() for this rather than millis(). Same idea though. The RTC interrupt will give you a constantly-updated count of how many micros() "ticks" are equal to a second of real time.
How well does micros() work in an ISR? Does it glitch or move backwards occasionally? Or can we count on it to only count forwards, even in an ISR?
Actually, what is the nature of our 1 Hz signal? Is it HIGH for 0.5 second, and then LOW for 0.5 second? If it is, then it is "slow" enough that there is no need for an ISR. But, I don't know how much time it spends HIGH and how much LOW.
Good point. I have seen that type of micros() behavior in a SAMD chip before. I solved it by adjusting the interrupt priorities in the ARM’s NVIC. Since AVR’s interrupt system is much simpler, it may not be a problem. The DS3231's SQW output is a square wave. So, you could indeed accomplish something similar by sampling.
But, I’d probably try it with the ISR first. As a test, I’d increase the SQW frequency to its 8.192 kHz maximum and then write some code to check for micros() glitches and such. Let that run for a couple of hours (or days) to get confidence in the technique.
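If one goes the polling route instead of the ISR, a minimal rising-edge detector is all that's needed; at 1 Hz the SQW line sits high for about half a second, so any loop() that runs even a few times per millisecond cannot miss the transition. This is my own illustrative sketch:

```cpp
// Detect a rising edge on the SQW line by polling, no ISR needed.
// Note: if the line happens to be high at startup, the very first
// sample will register as an edge. Illustrative sketch.
struct EdgeDetector {
    bool lastLevel = false;

    // Feed the current pin level (e.g. digitalRead(SQW_PIN) == HIGH);
    // returns true exactly once per low-to-high transition.
    bool risingEdge(bool level) {
        bool edge = level && !lastLevel;
        lastLevel = level;
        return edge;
    }
};
```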
gfvalvo:
But, I’d probably try it with the ISR first. As a test, I’d increase the SQW frequency to its 8.192 kHz maximum and then write some code to check for micros() glitches and such. Let that run for a couple of hours (or days) to get confidence in the technique.
... which, for the OP, would be overkill. And probably way over his head, as well.
Another way would be to time the stepper motor's steps so that the clock hands advance 1 second in (let's say) 0.97 second of real time, then pause until it is time for the next second to happen.
The idea is from here:
the only difference being, the clocks in the video have the sync happen once per minute, whereas my method has the sync happen once per second.
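The arithmetic behind the run-fast-then-pause scheme is straightforward: take all 40 steps for one second of hand motion in less than a real second, then idle until the RTC's next 1 Hz tick, which absorbs any drift. Using the 0.97 s figure from the post above (constant names are mine):

```cpp
#include <cstdint>

// Run-fast-then-pause: one hand-second of motion (40 steps at 25 ms
// nominal) is compressed into 0.97 s, leaving a ~30 ms pause that the
// RTC's 1 Hz tick absorbs, so long-term time stays exact.
// Illustrative sketch using the figure from the post above.
const uint32_t STEPS_PER_SECOND = 40;
const uint32_t ACTIVE_MICROS    = 970000;  // hand-second done in 0.97 s

uint32_t fastStepInterval() { return ACTIVE_MICROS / STEPS_PER_SECOND; }
uint32_t pauseMicros()      { return 1000000 - ACTIVE_MICROS; }
```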
I like this idea. Hopefully, if the difference in delay is small enough, the clock hands will still run smoothly.
This is what I came up with and it appears to be working, for now. I just need to leave it for a while to test the accuracy.
rtc.ino (1.3 KB)