Hello, I've been following the AVRfreaks tutorial on using timers. I'm setting up the timer with direct register access, as detailed at http://www.avrfreaks.net/index.php?name=PNphpBB2&file=viewtopic&t=50106.
Here's my program:
#include <avr/io.h>
#include <avr/interrupt.h>

ISR(TIMER1_COMPA_vect)
{
    PORTB ^= (1 << 3); // Toggle the LED
}

void setup() // run once, when the sketch starts
{
    DDRB |= (1 << 3);        // Set LED pin as an output
    TCCR1B |= (1 << WGM12);  // Configure timer 1 for CTC mode
    TIMSK1 |= (1 << OCIE1A); // Enable CTC interrupt
    sei();                   // Enable global interrupts

    OCR1A = 15624; // CTC compare value for 1 Hz at a 1 MHz AVR clock with a /64 prescaler

    TCCR1B |= ((1 << CS10) | (1 << CS11)); // Start timer at Fcpu/64
}

void loop() // run over and over again
{
}
That code was swiped directly from "Part 5: CTC Mode using Interrupts" at the link above. It's supposed to toggle PB3 about once per second (a 1 Hz signal), but on the oscilloscope I measure 1 ms on and 1 ms off, i.e. 500 Hz. The tutorial assumes a 1 MHz clock and I'm running at 16 MHz, so I'd expect mine to be 16 times faster, but it's not even that.
Furthermore, when I change the "OCR1A = 15624;" line to a smaller value (to shorten the period), nothing changes. The output seems stuck at 1 ms no matter what I write there.
So, uhhh.... why isn't this doing what I think it should?
P.S. Yes, I know the built-in LED is on digital pin 13 (PB5), but I tried that pin too and saw the same problem.