Reimplementing timing functions for high clock speed chips

I am moving away from the Arduino environment in my recent projects, but I still find the timing functions millis(), micros(), delay() and delay_us() useful. However, it isn't always easy to port those functions over to another platform, or to understand their internal workings.

So here is my not-so-trivial implementation for some high-clock-speed chips. A blinky built on these functions needs working interrupt handling and a working timer, so it doubles as a good sanity check when bringing up a new chip.

Example 1: ATtiny85, running at 16MHz with its internal 64MHz PLL for both system clock and Timer1:

/*
 * clock.c
 *
 * Created: 2016/9/8 0:28:51
 *  Author: max
 */ 

#include "clock.h"
#include <avr/interrupt.h>	// for ISR(), cli(), sei()
#include <avr/wdt.h>

void clock_init(void)
{
	// Wait for PLL lock. (No need to enable the PLL; it already drives the system clock.)
	while (!(PLLCSR & _BV(PLOCK))) wdt_reset();

	// Enable PLL as clock source for Timer1
	PLLCSR |= _BV(PCKE);

	// Start Timer1, no prescaling: it overflows every 256 ticks of the 64MHz PLL clock
	TCNT1 = 0x00;
	TCCR1 = 0x01;

	// Begin Timer1 interrupt
	TIMSK |= _BV(TOIE1);
}

volatile uint16_t millis_counter = 0;
volatile uint16_t micros_counter = 0;

ISR(TIMER1_OVF_vect)
{
	micros_counter += 1000000 / (F_CPU * 4 / 256);
	if (micros_counter >= 1000)
	{
		millis_counter++;
		micros_counter -= 1000;
	}
}

uint16_t millis(void)
{
	cli();
	uint16_t ms = millis_counter;
	sei();

	return ms;
}

uint16_t micros(void)
{
	cli();
	uint16_t ms = millis_counter;
	uint16_t us = micros_counter;
	uint8_t count = TCNT1;
	sei();

	return ms * 1000 + us + (count / (F_CPU * 4 / 1000000));
}

void delay(uint16_t time)
{
	int16_t end = millis() + time;
	while (end - (int16_t)millis() > 0) wdt_reset();
}

void delay_us(uint16_t time)
{
	int16_t end = micros() + time;
	while (end - (int16_t)micros() > 0) wdt_reset();
}

Example 2: PIC18F45K20, running at 64MHz using internal oscillator and PLL, clock on timer 2:

#include "system.h"

void clock_isr(void);

void clock_init(void)
{
    // Because we are running off HFINTOSC, we need to kick the clock frequency
    // up to 16MHz before turning on the PLL.
    OSCCONbits.IRCF = 0x7;
    
    // PLL on. CPU run at 64MHz
    OSCTUNEbits.PLLEN = 0x1;
    
    // Switch on Timer 2 interrupts
    TMR2_ISR = clock_isr;
    TMR2IF = 0;
    TMR2IE = 1;
    
    // Switch on Timer 2 at F_IO = 16MHz
    TMR2 = 0;
    PR2 = 0xff;
    T2CONbits.T2CKPS = 0;
    T2CONbits.T2OUTPS = 0;
    T2CONbits.TMR2ON = 1;
}

volatile uint16_t millis_counter = 0;
volatile uint16_t micros_counter = 0;

// Fires once every 16us
void clock_isr(void)
{
    micros_counter += 16;
    
    if (micros_counter >= 1000)
    {
        millis_counter++;
        micros_counter -= 1000;
    }
}

uint16_t millis(void)
{
    di();
    uint16_t ms = millis_counter;
    ei();
    return ms;
}

uint16_t micros(void)
{
    di();
    uint16_t ms = millis_counter;
    uint16_t us = micros_counter;
    uint8_t counts = TMR2;
    ei();
    return ms * 1000 + us + (counts >> 4);
}

void delay(uint16_t time)
{
    int16_t end = millis() + time;
    while (end - millis() > 0);
}

void delay_us(uint16_t time)
{
    int16_t end = micros() + time;
    while (end - micros() > 0);
}

I am going to create a generic CMSIS version for ARM.

ISR(TIMER1_OVF_vect)
{
	micros_counter += 1000000 / (F_CPU * 4 / 256);
	if (micros_counter >= 1000)
	{
		millis_counter++;
		micros_counter -= 1000;
	}
}

You add a million, and you're subtracting a thousand?

Also, pop quiz, what’s the largest value a uint16_t can hold? Anyone?

technix:

void delay(uint16_t time)

{
    int16_t end = millis() + time;
    while (end - millis() > 0);
}

void delay_us(uint16_t time)
{
    int16_t end = micros() + time;
    while (end - micros() > 0);
}

Unsigned subtraction is always 0 or > 0, but you are safe to hit (end - micros() == 0) as long as an interrupt doesn't take 16 cycles at the wrong time.

What you have is a count-down to zero. Below counts up to or past a set goal. If it misses due to an interrupt it will catch up soon enough.

void delay(uint16_t time)
{
    uint16_t start = millis();
    while (millis() - start < time);
}

void delay_us(uint16_t time)
{
    uint16_t start = micros();
    while (micros() - start < time);
}

Also note that 65535 millis is only 65.535 seconds, the limit of a 16-bit unsigned counter.
Using unsigned long (32-bit) takes millis to 49.71... DAYS and micros to 71.58... minutes.
One member here made a timer library that uses 64-bit unsigned; count millis till the sun goes red giant.

Jiggy-Ninja:

ISR(TIMER1_OVF_vect)
{
	micros_counter += 1000000 / (F_CPU * 4 / 256);
	if (micros_counter >= 1000)
	{
		millis_counter++;
		micros_counter -= 1000;
	}
}

You add a million, and you're subtracting a thousand?

Also, pop quiz, what's the largest value a uint16_t can hold? Anyone?

The compiler substitutes F_CPU there and optimizes the entire subexpression away.

1000000 / (16000000 * 4 / 256) = 64.

GoForSmoke:
Unsigned subtraction is always 0 or > 0, but you are safe to hit (end - micros() == 0) as long as an interrupt doesn't take 16 cycles at the wrong time.

What you have is a count-down to zero. Below counts up to or past a set goal. If it misses due to an interrupt it will catch up soon enough.

void delay(uint16_t time)
{
    uint16_t start = millis();
    while (millis() - start < time);
}

void delay_us(uint16_t time)
{
    uint16_t start = micros();
    while (micros() - start < time);
}

Also note that 65535 millis is only 65.535 seconds, the limit of a 16-bit unsigned counter.
Using unsigned long (32-bit) takes millis to 49.71... DAYS and micros to 71.58... minutes.
One member here made a timer library that uses 64-bit unsigned; count millis till the sun goes red giant.
I guess I can typedef a timer_t (unsigned) and timespan_t (signed) so I can choose from 16-bit, 32-bit or 64-bit timers as needed.

technix:
The compiler substitutes F_CPU there and optimizes the entire subexpression away.

1000000 / (16000000 * 4 / 256) = 64.

Funny, I get 4.

1000000 / (16000000 * 4 / 256)
= 1000000 / (1000000 * 64 / 256)
= 1000000 / 250000
= 4

technix:
I guess I can typedef a timer_t (unsigned) and timespan_t (signed) so I can choose from 16-bit, 32-bit or 64-bit timers as needed.

The important part is whether you count up or down: counting up gives you the ability to catch an event at or after an exact time.

it isn't always easy to port those functions over to another platform

Most timers will have the ability to create a periodic interrupt. That's what timers DO. The complications arise when you want to use the same timer for your periodic interrupt AND some other function (like PWM on the Arduino AVRs).
ARM Cortex is trivial because they have a timer completely dedicated to that "periodic interrupt" function (SysTick).
(Although on the SAMD10, I found it convenient to use the RTC timer instead: I could get it to count every microsecond and interrupt every millisecond, making the higher-level functions especially trivial.)

I'm not sure why you're running your timers at high clock rates (>1MHz) when you only want to time microseconds. I guess it'd be useful for high frequency PWM ("as high as possible"), but you didn't SAY that was a goal...

I don't think your code:

    int16_t end = micros() + time;
    while (end - micros() > 0);

(and similar) properly handles wrap around of the counters. See http://playground.arduino.cc/Code/TimingRollover

westfw:
I don't think your code:

    int16_t end = micros() + time;
    while (end - micros() > 0);

(and similar) properly handles wrap around of the counters. See http://playground.arduino.cc/Code/TimingRollover

The unsigned subtraction of micros() from end handles any rollover, but the test is only good for the single microsecond when micros() == end. Miss that and the next chance is 65536 micros later.

Putting the subtraction in the order of (end - micros() > 0) makes a count-down.
Putting the subtraction in the order of (micros() - start >= wait) makes a count-up with a catch-past limit.