Hey all,
I'm using an ATmega328P clocked at 20 MHz. The code below seems to mess up the light output control for dimming. I have a hunch about what the issue is, but I don't know how to manipulate it.
So the problem is: as a test, I have 3 MOSFETs connected to PWM pins 3, 5, and 6 to control the brightness of some LEDs, ramping them up and down. For some reason, on the descent it does this jittery, repeated half-cycle. The power source is a buck converter that switches at 175 kHz (much faster than the ~10 kHz PWM), so I didn't think that was the problem.
This all works as expected with the prescalers left at their default values (ramps up and down smoothly), with PWM frequencies of ~1 kHz and ~500 Hz respectively.
//set timer0 prescaler to 8: fast PWM on pins 5 and 6 at 9765 Hz
//millis() and delay() are now WRONG!!!
TCCR0A = _BV(COM0A1) | _BV(COM0B1) | _BV(WGM01) | _BV(WGM00);
TCCR0B = _BV(CS01);
//set timer2 prescaler to 8: fast PWM on pins 3 and 11 at 9765 Hz
TCCR2A = _BV(COM2A1) | _BV(COM2B1) | _BV(WGM21) | _BV(WGM20);
TCCR2B = _BV(CS21);
I have read something about the output compare registers OCR0A/OCR2A and OCR0B/OCR2B. I think those are what I want to manipulate, but I don't know why or how.
Every color comes out correctly; it just jitters when fading down instead of being smooth. Any thoughts?