The effect of delays on a timer

Hello,
I am working on a project, and it's almost done, but I have a question about the accuracy of my project.
I am working on a simple countdown timer that uses two seven segment displays for setting the time.
Now, the code that I am using has two 1 ms delays, which are used so that the displays don't update too fast.
My question is, how accurate will my timer be if, say, I want to turn an LED on after two hours? It doesn't have to be dead on; +/-5 minutes is OK.

It depends on what else your program is doing; other code might interfere with the accuracy of the timekeeping.

I guess you will have a separate routine that checks whether the LED is due to be lit and, when the 2 hours is up, switches the LED on, and that this will be controlled via a technique similar to the "BlinkWithoutDelay" example sketch in the IDE.

I am also assuming that the routine for setting your displays will run, then the LED check-and-switch routine will run, then the display routine again, as the main loop goes round and round.

I would have thought that your use of delay() would not put the timing out noticeably, if at all, in practice. You could try putting it all together with the timing set so your LED goes on after a much shorter time, say 10 seconds, then test it and see what happens.

Now, the code that I am using has two 1 ms delays, which are used so that the displays don't update too fast.

Which is two more than it should have. Read, understand, AND embrace the BlinkWithoutDelay example's philosophy.

Try this :

void loop() {
  delay(1000);              // wait one second between prints
  Serial.println(millis());
}

You can see millis() increasing by about 1000 each time, even with the delay. Since millis() is driven by a hardware timer interrupt, it keeps counting while delay() runs, so you have no problem here.

Use the Time library to keep track of what time it is. This is more accurate than assuming delay(x) will give you exact timing, and it also lets you put other code in there without having to worry that its execution time will throw off your timing.

Every 100 ms (if that's fine enough accuracy for you), check the internal clock to see whether it's time to do another update. Also, Arduinos tend to lose time. At least mine does, at a rate of about 1 second every 4-5 minutes. If you care about this, you can interface with a real-time clock.

I used to use the DS1307 chip, but it honestly sucks for anything that has to keep time for over a day, unless it will be refreshed from a computer. The DS3231 costs more, but is FAR more accurate (to within about one minute/year) and you don't have to supply the crystal or any pullup resistors.

The DS1307 comes in a through-hole format that will work on breadboards, whereas the DS3231 only comes in surface-mount packages. I got a SOIC-8 breakout from Adafruit, and it works perfectly, although I did have to solder up a watch battery clip to keep the Vbat pin supplied. There are also Amazon sellers who will sell you a complete DS3231 board (with a watch battery on the back) that you can just connect to the power and I2C pins. That way, integration is someone else's problem.

The Arduino clock (quartz) could be assumed to have, say, 1% accuracy. But if you use delay(1) in your loop, you also have to count the time taken by the other instructions before the delay. If you instead check the millis() timer for an elapsed value of two hours, you can expect good accuracy.