I'm programming a countdown timer for our school lessons, so the students can see how much time is left in a lesson.
The lessons are 45 minutes, the breaks 15 minutes.
Now, I start with 2,700,000 milliseconds (45 minutes) and every 1000 milliseconds I subtract 1000 from that number. For the timing I use the blink-without-delay method.
Very simple, but it works.
One thing: the whole loop takes a bit of time too. So when I count down a second, it will be a bit more than a second, I think?
Would it be more accurate if I used an RTC? Or is the deviation small enough over a whole day?