I have a sketch which results in the attached scope images, captured on pin 13.
I am toggling the pin with the attached code.
When I set the period to 1 ms, I get a slightly shortened (or lengthened) pulse every ~43 ms.
I do realize that if I want to get 1 ms (or less), I should use micros() and a value of 1000UL for waitMillis.
What I am trying to understand is: when I use the sketch as is, why do I get these pulses every ~43 ms?
struct timer
{
  unsigned long lastMillis;
  unsigned long waitMillis;
  bool restart;
  bool enableFlag;

  bool CheckTime() //Delay time expired function "CheckTime()"
  {
    //is the time up for this task?
    if (millis() - lastMillis >= waitMillis) //for shorter than 10ms, use micros()
    {
      //should this start again?
      if (restart)
      {
        //get ready for the next iteration
        lastMillis += waitMillis;
      }
      //time was reached
      return true;
    }
    //time was not reached
    return false;
  } //END of CheckTime()

}; //END of structure timer
//******************************************
//Create all my timer objects and initialize
//******************************************
timer pin13 =
{
  0, 1UL, true, true //lastMillis, waitMillis, restart, enableFlag --> use micros() then 1000UL for waitMillis to get 1ms delay
};
//***************************
const byte Pin13 = 13;
//**********************************************************************
void setup()
{
  Serial.begin(9600);
  pinMode(Pin13, OUTPUT);
  digitalWrite(Pin13, LOW);
} // >>>>>>>>>>>>>> E N D  O F  s e t u p ( ) <<<<<<<<<<<<<<<<<
void loop()
{
  //***************************
  //toggle Pin13 every 1ms
  if (pin13.CheckTime())
  {
    //Toggle
    digitalWrite(Pin13, !digitalRead(Pin13));
  }
} // >>>>>>>>>>>>>> E N D  O F  l o o p ( ) <<<<<<<<<<<<<<<<<
I seem to recall that millis() does not necessarily count in increments of one millisecond. If you print the millis() value at a high rate, I think you'll see it occasionally jump by two, a side effect of the hardware timer overflowing every 2^n clocks of the 16 MHz oscillator rather than at exact millisecond boundaries.
millis() accuracy . . .
Each timer0 overflow takes 1024 µs, but millis() normally advances by only 1, so each overflow leaves 24 µs (3/125 of a millisecond) unaccounted for. Since 125 / 3 is 41.67, an extra millisecond is inserted every 42 overflows or so: by then the accumulated shortfall is 42 × 24 µs = 1008 µs, just over one whole millisecond. (And 42 overflows × 1024 µs ≈ 43 ms, which matches the interval at which you see the anomalous pulse.)
If you are timing small intervals, micros() will be much more accurate, as it reads the hardware timer directly and does not suffer from this creeping error. However, it wraps after around 71 minutes. Also, micros() has a resolution of 4 µs (not 1 µs) because of the way the timer is configured: it counts to 256 using a prescaler of 64. One clock cycle is 62.5 ns, so the timer "ticks" every 4 µs and overflows every 256 × 4 µs = 1024 µs.
LarryD:
A bit of drift in millis() adds up, and every ~43 ms a slip occurs, making CheckTime() return true on two successive passes through loop().
Hence the ~30 µs short pulse.
The millis() function is NOT always counting up by one every millisecond.
What the millis() function guarantees is this: it counts up by 1000 over 1000 milliseconds.
But in fact the function sometimes counts up by one (most cases) and sometimes counts up by two (some cases) within one millisecond.
After 1000 milliseconds have passed, the counter has advanced by 1000 and everything is fine.
This is caused by the internal timer overflowing every 1024 µs rather than every 1000 µs, so the millis() function sometimes has to insert a "leap millisecond" and count up by 2 at once to stay in sync with the real time passing by.
Or put another way: the timer0 interrupt happens every 1.024 ms, and millis() jumps by 2 every so often to avoid becoming more than 1.024 ms late. If you want timing more accurate than +/- 1 ms, use micros().