Hi. Part of the GUI I'm designing consists of a rotary knob and a display where the user can select a frequency in the range from 1 Hz to 500 Hz. I still need to choose a step (e.g. 5 or 10 Hz), but that doesn't change the problem.
Obviously, in the code I need to work with the period corresponding to the selected frequency.
I use the internal 8-bit Timer0 with clkI/O/64 from the prescaler, which is one tick every 4 µs at 16 MHz.
Setting OCR0A to 249 gives me a neat 1.000 ms period (250 ticks × 4 µs). The corresponding ISR increments a counter every 1 ms, and in the main loop, once the counter reaches the target value (e.g. 10 ms for 100 Hz), the event is triggered. That also solves the problem of dealing with low frequencies.
This obviously works fine for frequencies whose period comes out to a whole number of milliseconds.
But if the user selects, for example, 140 Hz, the period is 7.14 ms and I'm stuck.
I could reduce OCR0A to 24, giving me 10 times as many ISR calls (a 100 µs tick), but then I'd have to enlarge the counter variables, especially for low frequencies.
I need a formula that converts a frequency into the right millisecond/microsecond period with the right resolution... maybe from 1 to 20 Hz in 1 Hz steps and from 20 to 500 Hz in 10 Hz steps.
The highest available prescaler is 1024, so the lowest frequency Timer0 can reach (toggling OC0A in CTC mode) is 30.5175 Hz. You could use hardware for frequencies from 500 Hz down to about 30 Hz, and software counting for frequencies under 30 Hz (periods longer than 32768 µs) down to 1 Hz.