LOL, what a mistake! I forgot to call the timer init function, sorry about that.
Once I added it to the setup function it worked, and I can control TIMER0 as well.
Here's my code:
#include <avr/io.h>
#include <util/delay.h>
#include <stdlib.h>          // itoa

char str_arr[8];
uint8_t num;

void setup() {
    I2C_init();
    LCD_Init();
    TCCR0B = (1 << CS01);    // start Timer0, prescaler Fosc/8
}

void loop() {
    TCNT0 = 0;               // reset the count
    _delay_us(100);
    num = TCNT0 / 2;         // 0.5 us per tick, so /2 gives microseconds
    itoa(num, str_arr, 10);
    LCD_string(str_arr);
    _delay_ms(1000);
    clr_dis();
}
With this setup the ATmega runs at 16 MHz, so one instruction cycle is 62.5 ns; with the Fosc/8 prescaler the timer ticks every 0.5 us.
Measuring a 100 us delay I should therefore read 200 ticks, which is exactly what I see on the LCD, and dividing the timer result by 2 gives me the 100 back!
Another interesting performance aspect I noticed: replacing
_delay_us(100);
with
while(m<100){m++;}
changes the result. The LCD shows 50 with the while loop instead of the 100 I get with _delay_us. That corresponds to a raw count of 100 ticks, i.e. 50 us for 100 iterations, so each iteration of the un-optimized loop costs about 8 CPU cycles (load, increment, store, compare, branch). Note that avr-libc's _delay_us/_delay_ms don't use hardware timers at all; they are cycle-counted busy-wait loops calibrated from F_CPU, which is why they hit the target exactly.
I've done my DHT11 timing tests and the results are impressive!
I got exactly the timings I expected. I couldn't get the same results with my PIC18F4550, and I don't know why.
The PIC18F4550 worked well for measuring the bits received from the VS1838B IR sensor, but not with the DHT11.
This is the code:
// Assumes F_CPU = 16 MHz, DHT11 data line on PB0, Timer0 at Fosc/8
#define TMR0_MSK TCNT0            // shorthand for the raw Timer0 count

void DH11_init(void)
{
    _delay_ms(2000);              // DHT11 needs time to settle after power-up
    TCCR0B = (1 << CS01);         // start Timer0, prescaler Fosc/8
}

void DH11_read(uint8_t *bits_timing)
{
    uint8_t i;

    DDRB  |= (1 << DDB0);         // PB0 output
    PORTB &= ~(1 << PB0);         // pull the line low...
    _delay_ms(18);                // ...for at least 18 ms (start signal)
    PORTB |= (1 << PB0);
    DDRB  &= ~(1 << DDB0);        // release: PB0 input, pull-up enabled

    TCNT0 = 0;
    while (PINB & (1 << PINB0));     // measuring these pulses for testing purposes
    bits_timing[0] = TMR0_MSK;

    TCNT0 = 0;
    while (!(PINB & (1 << PINB0)));  // DHT11 ~80 us LOW response
    bits_timing[1] = TMR0_MSK;

    TCNT0 = 0;
    while (PINB & (1 << PINB0));     // DHT11 ~80 us HIGH response
    bits_timing[2] = TMR0_MSK;

    for (i = 3; i < 43; i++)         // 40 data bits
    {
        while (!(PINB & (1 << PINB0))); // wait out the ~50 us low lead-in
        TCNT0 = 0;
        while (PINB & (1 << PINB0));    // high time: ~26-28 us = 0, ~70 us = 1
        bits_timing[i] = TMR0_MSK;
    }
}
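Once bits_timing is filled, turning the 40 pulse widths into the five DHT11 bytes is just a threshold compare: at 0.5 us per tick a 0 bit is around 54 ticks and a 1 bit around 140, so anything above ~100 ticks counts as a 1. A host-testable sketch, assuming the same array layout as above (bits start at index 3); the function name is mine, not from the original code:

```c
#include <stdint.h>

/* Decode bits_timing[3..42] (Timer0 ticks, 0.5 us each) into the five
 * DHT11 bytes: RH integer, RH decimal, T integer, T decimal, checksum.
 * Returns 1 if the checksum matches, 0 otherwise. */
static int dh11_decode(const uint8_t *bits_timing, uint8_t data[5])
{
    uint8_t i;
    for (i = 0; i < 5; i++)
        data[i] = 0;
    for (i = 0; i < 40; i++) {
        data[i / 8] <<= 1;                /* bits arrive MSB first */
        if (bits_timing[3 + i] > 100)     /* > ~50 us high time -> 1 */
            data[i / 8] |= 1;
    }
    return (uint8_t)(data[0] + data[1] + data[2] + data[3]) == data[4];
}
```

This runs fine on a PC with a synthetic timing array, which makes it easy to unit-test the decoding separately from the pin-polling code.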