I am lighting a dollhouse. The LEDs will be white, and on a button push they will fade to yellow, green, or red (depending on which button). I am using the PWMAllPins example from the playground to let the ATmega chip act as an LED driver. I have modified the code a bit and now understand pretty well what the example is doing: I can define an array of 14 values between 0 and 255, and the LEDs on those pins light accordingly. Long story short, the chip works well as an LED driver. The problem is that I would now like the chip to do a bit more math, and maybe drive an LCD or serial output. Adding any of that into the loop creates a flicker in the "pwm" output.
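To show what I mean, here is a stripped-down version of the core logic (illustrative names, not the actual PWMAllPins code): one duty value per pin and a free-running 8-bit tick counter, with each pin held high while the counter is below its duty value.

```cpp
#include <cstdint>

// Stripped-down core of the software-PWM driver: one duty value
// per pin (0-255) and a free-running 8-bit tick counter.  On the
// Arduino, pinHigh() would decide what digitalWrite() outputs on
// each pass through loop().
const int NUM_PINS = 14;
uint8_t duty[NUM_PINS] = {0, 64, 128, 255};  // remaining entries default to 0

// Whether a given pin should be high on this tick.
bool pinHigh(uint8_t counter, uint8_t d) {
    return counter < d;
}

// How many ticks of a full 256-tick period a pin spends high.
int ticksHigh(uint8_t d) {
    int high = 0;
    for (int c = 0; c < 256; ++c)
        if (pinHigh(static_cast<uint8_t>(c), d)) ++high;
    return high;
}
```

Any extra work dropped into that loop stretches the tick period unevenly, which is exactly the flicker I'm seeing.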
I now want to use one Arduino chip for the logic and a second as a simple LED driver that only accepts intensity values and lights the LEDs accordingly. What I need to know is the best way to connect the two chips.
Would I2C be best here? I have absolutely no knowledge of it, but I know there are tons of examples out there.
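From the examples I've skimmed, the I2C version looks short with the Wire library; something like this is my rough (untested) understanding, with slave address 4 picked arbitrarily:

```cpp
// Master (logic chip): send three intensity bytes to the driver.
#include <Wire.h>

byte red = 255, green = 128, blue = 0;  // example values

void setup() {
  Wire.begin();                 // join the bus as master
}

void loop() {
  Wire.beginTransmission(4);    // driver's slave address (arbitrary)
  Wire.write(red);
  Wire.write(green);
  Wire.write(blue);
  Wire.endTransmission();
  delay(20);
}

// --- Driver chip (separate sketch): receive the bytes in a handler. ---
//
// #include <Wire.h>
// volatile byte rgb[3];
// void receiveEvent(int howMany) {
//   for (int i = 0; i < 3 && Wire.available(); ++i) rgb[i] = Wire.read();
// }
// void setup() {
//   Wire.begin(4);                  // join the bus as slave #4
//   Wire.onReceive(receiveEvent);   // called from an interrupt
// }
// void loop() { /* software-PWM loop reads rgb[] */ }
```

One thing I'm unsure about: since onReceive runs from an interrupt, I don't know whether it would itself put a small hiccup in the software-PWM timing on the driver chip.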
Would PWM outputs into the driver's analog inputs work? This is ideal for me because it would let me write the master sketch as if I were driving a regular RGB LED. (I only need 3 colors, each replicated 4 times, for a total of 12 pins out but only 3 values in.)
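To be concrete about the fan-out: the master produces just three values, and the driver copies each one to four pins. On the driver side, a 10-bit analogRead (0-1023) would need scaling down to the 0-255 duty range, e.g. by dividing by 4. As far as I can tell, a raw PWM signal would also need a simple RC low-pass filter between the chips, since analogRead samples the instantaneous pin voltage rather than the average. A sketch of the scaling/fan-out logic (pin mapping is made up):

```cpp
#include <cstdint>

// Driver side: 3 analog inputs fan out to 12 PWM channels
// (each color value replicated on 4 pins).
const int COLORS = 3;
const int COPIES = 4;

// Scale a 10-bit ADC reading (0-1023) to an 8-bit duty (0-255).
uint8_t adcToDuty(int raw) {
    return static_cast<uint8_t>(raw / 4);
}

// Expand the 3 readings into the 12-entry duty array
// (indexing is illustrative, not my real pin layout).
void expand(const int raw[COLORS], uint8_t duty[COLORS * COPIES]) {
    for (int c = 0; c < COLORS; ++c)
        for (int k = 0; k < COPIES; ++k)
            duty[c * COPIES + k] = adcToDuty(raw[c]);
}
```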
Would serial work? From what I understand, regular serial might be too slow (anything in the main loop that takes time creates flicker in the LEDs), but I understand serial pretty well, so it would be easier for me to implement than I2C.
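Doing the back-of-envelope math, the data itself is tiny: a frame of one start byte plus three intensity bytes is about 4 ms at 9600 baud, and it would only need to be sent when a button changes the colors. My worry is really blocking reads, so I imagine the driver would consume one byte per pass through loop() with a little state machine like this (the start-byte framing is my own invention; intensities would be capped at 254 since 255 is the start marker):

```cpp
#include <cstdint>

// Tiny framed protocol: start byte 0xFF, then three intensity
// bytes (0-254; 255 is reserved as the start marker).  On the
// driver chip, feedByte() would be called once per Serial.read()
// inside loop(), so it never blocks the software-PWM timing.
const uint8_t START = 0xFF;

struct Decoder {
    uint8_t rgb[3] = {0, 0, 0};  // last complete frame
    int idx = -1;                // -1 = waiting for a start byte

    // Returns true when a complete 3-byte frame has arrived.
    bool feedByte(uint8_t b) {
        if (b == START) { idx = 0; return false; }  // (re)sync
        if (idx < 0) return false;                  // junk before sync
        rgb[idx++] = b;
        if (idx == 3) { idx = -1; return true; }
        return false;
    }
};
```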
I would love to use the PWM method, but I'm not sure how the analog inputs would handle a PWM signal, or whether they would be accurate enough to take an input this way.
Sorry for all the wordiness!!!
Thanks for the help, -Adam