
Topic: reset millis() ? (Read 5055 times)

macsimski

Hello all,

Is it possible to reset millis() to zero? Because millis() will overflow in about 9 hours, it is better to let it go to zero in a controlled environment at a convenient time instead of in the middle of a calculation.

--
"We're all in this together..."

CosineKitty

Instead of using millis(), you could use timer0_overflow_count directly.  Just add this toward the top of your sketch:

   extern volatile unsigned long timer0_overflow_count;

The only tricky bit is that it will not be quite milliseconds.  Each increment of timer0_overflow_count is 128/125 of a millisecond, or 1.024 milliseconds.  But the nice thing is that it will take a lot longer for this to overflow:  about 50.9 days by my calculations.

You would also have to avoid any runtime function that uses millis(), such as delay().

mellis

If you add the line that CosineKitty suggested, you could just stick a:

Code: [Select]
timer0_overflow_count = 0;

in your code somewhere.  And delay() should still work okay.

CosineKitty

To go off on a tangent a little bit: when I saw the source code, I wondered why functions like millis() do not disable interrupts while accessing timer0_overflow_count:

Code: [Select]

unsigned long millis()
{
   // (comments omitted)
   return timer0_overflow_count * 64UL * 2UL / (F_CPU / 128000UL);
}


Because it takes multiple ATmega8 instructions to fetch the value of a 4-byte long integer like timer0_overflow_count, isn't it possible for a timer interrupt to occur in the middle of those instructions and corrupt the result?  I wonder if the following might make the code safer:

Code: [Select]

unsigned long millis()
{
   // (comments omitted)
   unsigned long safe_copy;

   cli();    
   safe_copy = timer0_overflow_count;
   sei();

   return safe_copy * 64UL * 2UL / (F_CPU / 128000UL);
}


Likewise, to reset the timer to zero, it might be better to do:

Code: [Select]

cli();
timer0_overflow_count = 0;
sei();

macsimski

So, to be on the safe side, I would use

Code: [Select]

extern volatile unsigned long timer0_overflow_count;

in my declarations

and reset it with

Code: [Select]

timer0_overflow_count = 0;


anytime I want in my code.
--
"We're all in this together..."

mellis

Yep, that should work.  

tateu

I guess this is kind of the same topic: any idea how to get a "time" reading much more precise than millis()? I'm trying to measure the time difference between two inputs, but I would need microsecond rather than millisecond resolution. Is there a register I can read directly? Thanks.

mellis

You could speed up the timer that increments millis().  The counts won't be milliseconds any more, of course.

Putting this line in your setup() should make it run 8 times as fast:

Code: [Select]
TCCR0 &= ~(1 << CS00);

Or this one should speed things up by a factor of 64:

Code: [Select]
TCCR0 &= ~(1 << CS01);

Don't use them both, though, or you'll turn the timer off completely.

For the ATmega168 (Arduino Mini), replace the TCCR0 with TCCR0B.

tateu

Thanks, mellis. I will try the factor of 64. Ideally I would be able to tap into the microseconds. I was thinking that since there is a delayMicroseconds() function, there has to be a microseconds counter somewhere... is that the case? Anyway, thanks for your answer.

mellis

delayMicroseconds() just uses a busy-wait loop of the right duration; there's no microsecond counter.

tateu

OK, thanks. I'll try your trick to accelerate by 64 then. Cheers!
