A timing question about microcontrollers

Let's say I am using an Arduino or a Microchip PIC series microcontroller. I want to turn on an LED from an output every 1 second using a timer interrupt. I power up my controller and it starts to run. After a week or two, will it continue to generate an interrupt every 1 second, or will I get any error with my interrupt? For example, is it possible to get an interrupt after 1010 milliseconds, or 1100 ms, or 1200 ms...?

Thank you.

Most Arduino boards run from a 16 MHz crystal, so all of the timing is derived from that. If you define 1 second now, it will still be 1 second a week from now.

There is more to this...
The microcontroller on the Arduino has hardware timers inside, and those run from the 16 MHz clock. There are libraries for them, but you can also program them yourself.
The Arduino core provides default libraries, and to keep them flexible, some timing functions let the timers generate an interrupt. That interrupt can be delayed by other interrupts.
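
For the 1-second LED in the original question, a hardware timer can generate the interrupt directly. Here is a minimal sketch for an ATmega328-based board (Uno/Nano); it assumes you are comfortable writing the Timer1 registers yourself, and uses CTC mode with a /256 prescaler (16 MHz / 256 = 62 500 ticks per second):

const byte ledPin = 13;

void setup() {
  pinMode(ledPin, OUTPUT);

  noInterrupts();
  TCCR1A = 0;                   // normal port operation, no PWM
  TCCR1B = 0;
  TCNT1  = 0;
  OCR1A  = 62499;               // 62 500 ticks -> compare match every 1 s
  TCCR1B |= (1 << WGM12);       // CTC mode: hardware clears the counter at OCR1A
  TCCR1B |= (1 << CS12);        // prescaler /256
  TIMSK1 |= (1 << OCIE1A);      // enable the compare-match interrupt
  interrupts();
}

ISR(TIMER1_COMPA_vect) {
  digitalWrite(ledPin, !digitalRead(ledPin));   // toggle once per second
}

void loop() {
  // nothing needed here; the timer hardware keeps the 1 s beat
}

Because the hardware clears the counter at the compare match, the period stays locked to the crystal no matter how late the ISR is entered.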

The main reason for a project to "lose time" after running for a long time is that the software is likely to keep track of the amount of time that has gone by in some sort of variable that gets added to after each interrupt. After some time, that variable will "wrap around" from all ones to zero. This doesn't mean that the program WILL break, but it needs to have been written carefully to avoid doing so.
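
To see why carefully written code survives the wrap-around, note that unsigned subtraction wraps the same way the counter does. A tiny sketch with values picked (hypothetically) just either side of the 32-bit wrap point:

void setup() {
  Serial.begin(9600);

  unsigned long before = 4294967290UL;   // 6 ms before the counter wraps to 0
  unsigned long after  = 10UL;           // 10 ms after the wrap

  // The elapsed time still comes out right across the rollover.
  Serial.println(after - before);        // prints 16
}

void loop() {}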

blaxoul:
will it continue to generate an interrupt every 1 second, or will I get any error with my interrupt?

I think it would help if you explained what you think might cause the error (in case we are not dealing with the issue that is worrying you).

...R

Generally, when you use millis() to do your timing, you should use delta time stamps in order to avoid rollover issues. Basically, that means you should subtract the last millis() time from the current millis() time whenever you check your time interval; the unsigned arithmetic compensates automatically when the millis() value rolls over.
Here is code from the BlinkWithoutDelay Example:

// Globals the loop() below relies on: LED on pin 13, 1000 ms blink interval.
const int ledPin = 13;              // the number of the LED pin
int ledState = LOW;                 // ledState used to set the LED
unsigned long previousMillis = 0;   // will store the last time the LED was updated
const long interval = 1000;         // interval at which to blink (milliseconds)

void setup()
{
  pinMode(ledPin, OUTPUT);
}

void loop()
{
  // here is where you'd put code that needs to be running all the time.

  // check to see if it's time to blink the LED; that is, if the
  // difference between the current time and last time you blinked
  // the LED is bigger than the interval at which you want to
  // blink the LED.
  unsigned long currentMillis = millis();

  if (currentMillis - previousMillis > interval) {
    // save the last time you blinked the LED
    previousMillis = currentMillis;

    // if the LED is off turn it on and vice-versa:
    if (ledState == LOW)
      ledState = HIGH;
    else
      ledState = LOW;

    // set the LED with the ledState variable:
    digitalWrite(ledPin, ledState);
  }
}

This code has the possibility to introduce what I believe the OP is talking about: drift. Your code that does useful work, in the example (// here is where you'd put code that needs to be running all the time)... what happens if it takes longer than one millisecond to execute? For example, with interval set to 1000: previousMillis is 62000, currentMillis was 62999, and the code takes 2 milliseconds to execute, so that the next time the checking code runs, currentMillis is 63001. The LED blinks, previousMillis gets set to 63001, and you go merrily on your way. Except that you have lost 1 millisecond. Over time, your code can drift significantly.

You can program around this.

Instead of

    previousMillis = currentMillis;

just do

    previousMillis += interval;
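
Putting that together, a drift-free version of the example's loop() could look like this (same globals as the example above, nothing new introduced):

void loop()
{
  // code that needs to run all the time goes here

  unsigned long currentMillis = millis();

  if (currentMillis - previousMillis > interval) {
    // advance by the interval instead of jumping to "now", so a late check
    // does not get baked into every later blink
    previousMillis += interval;

    // toggle the LED
    if (ledState == LOW)
      ledState = HIGH;
    else
      ledState = LOW;
    digitalWrite(ledPin, ledState);
  }
}

One thing to watch: if loop() ever falls behind by more than a whole interval, this version fires on consecutive passes until it has caught up. For timekeeping that is usually what you want, but it can look like a quick burst of toggles.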

Using timer interrupts you can introduce drift as well. If you trigger the timer as a one-shot, and re-trigger it from the timer interrupt, you will add clock cycles every time it fires. That shouldn't happen if you set up the timer to fire repeatedly.
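
To make that concrete, here is what the drift-prone pattern looks like on an ATmega328 board: Timer1 runs in normal (overflow) mode and the ISR reloads the counter in software. Any ticks counted between the overflow and the reload are thrown away, so interrupt latency can stretch the period; the CTC sketch earlier avoids that because the hardware clears the counter itself.

const byte ledPin = 13;

void setup() {
  pinMode(ledPin, OUTPUT);

  noInterrupts();
  TCCR1A = 0;
  TCCR1B = (1 << CS12);          // normal mode, prescaler /256 (62 500 ticks = 1 s)
  TCNT1  = 65536UL - 62500UL;    // pre-load so the first overflow comes in 1 s
  TIMSK1 = (1 << TOIE1);         // enable the overflow interrupt
  interrupts();
}

ISR(TIMER1_OVF_vect) {
  // Software reload: any timer ticks that elapsed between the overflow and this
  // write are lost, so latency here can stretch the period.
  TCNT1 = 65536UL - 62500UL;
  digitalWrite(ledPin, !digitalRead(ledPin));
}

void loop() {}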