I have a project where some LEDs are being dimmed with PWM (using a TLC5947) while an amplifier plays music from an MP3 module (WT5001, I think, is the part number), and I'm getting noise from the PWM on the amplifier.
I had a similar problem in the past and a few solutions were offered:
http://forum.arduino.cc/index.php?PHPSESSID=jgjq3gtpp7sttahf5le3f4tgl0&topic=150336.30
It was suggested I add a capacitor for decoupling, which I have done. A 220uF capacitor only reduced the noise a bit, and a huge 4700uF capacitor I had lying around did only slightly better. So obviously I need something more than just a capacitor on those LED modules.
In the old thread it was suggested I use an inductor. I don't have one lying around, but I might try that.
Another suggestion, though, was to just add a series resistor ahead of the capacitor so the current has to flow through it. But DC42 said that would only work if the resulting voltage drop is acceptable.
The thing is, I don't know how to compute that voltage drop. I know it's a really simple thing, but I can't figure it out after looking at several tutorials.
My input to the TLC5947 is 5V. I'm not sure off the top of my head what the minimum voltage it can run on is, but that's probably not an issue. I do, however, know that the voltage needs to stay higher than what my LEDs require, so we should probably assume we don't want less than, say, 3.5V.
I also know that I have 12 LEDs attached and they may draw up to 20mA each, so that's 240mA maximum current draw.
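Just to keep my numbers straight, here's the worst case I'm working from, written out as a little sketch (assuming all twelve channels are on at full current at the same time):

    // Numbers I'm working from, worst case (all 12 channels on at full current).
    #include <cstdio>

    int main() {
        const double supplyVolts = 5.0;          // into the TLC5947
        const double minVolts = 3.5;             // rough floor I don't want to dip below
        const int numLeds = 12;
        const double ledCurrentAmps = 0.020;     // up to 20 mA per LED
        const double totalCurrentAmps = numLeds * ledCurrentAmps;   // 0.24 A
        std::printf("Room to drop: %.1f V, worst-case current: %.0f mA\n",
                    supplyVolts - minVolts, totalCurrentAmps * 1000.0);
        return 0;
    }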
And then this is where I get lost.
P = E x I, and 5V is going into the resistor, so presumably P = 5V x 240mA, which is 1.2 watts. But how do I calculate the voltage drop across the resistor? No matter how I rearrange the terms I just get 5V back. And if I reduce the current, as it would be when the LEDs are dimmed, the voltage actually goes up when I redo the calculation with the resistor I selected. It really doesn't make any sense to me.
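To show exactly where I go in circles, here is that dead-end calculation written out (this is just a sketch of my own reasoning, not something I'm claiming is correct):

    // The calculation I keep going in circles on: P = E * I with the full
    // 5V supply and the full 240mA.
    #include <cstdio>

    int main() {
        const double supplyVolts = 5.0;
        const double currentAmps = 0.240;
        const double watts = supplyVolts * currentAmps;   // 1.2 W
        // Rearranging just hands the 5V back and says nothing about the
        // drop across the resistor itself.
        const double voltsBack = watts / currentAmps;     // 5.0 V again
        std::printf("P = %.2f W, E = %.2f V\n", watts, voltsBack);
        return 0;
    }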
The only thing I can think to do differently is to take the LED forward voltage into account. So a 2.2V drop across the LED leaves 3.8V for the resistor to drop (but not really, because the TLC5947 is a constant-current driver, so the resistor doesn't need to drop anything to safely drive the LED), which reduces the watts going through the resistor. That's helpful, since it wouldn't need to be as large, but... I'm still lost and have no idea what I'm doing. I also can't even assume that 2.2V drop for the LEDs, because I have twelve of them connected to this driver, so who knows what the hell the voltage drop from the LEDs actually is.
Maybe I should just take the current I know the LEDs are going to draw, plug in the resistance, and solve for the voltage? That gives me 4.8V for a 20 ohm resistor and 240mA of current, which seems like a sane result. But if I up the resistance to 40 ohms I get 9.6V, which doesn't make sense. Well, maybe it means that with a 40 ohm resistor I'd need 9.6V in to get 240mA out? I guess that makes sense. So is the voltage drop in the first example then 0.2V? And is the power the resistor has to dissipate, with 240mA going through the 20 ohm resistance, 0.048W?
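Here's that attempt written out, in case it makes my confusion clearer (again, just a sketch of the arithmetic, assuming the full 240mA is flowing):

    // Ohm's law attempt: V = I * R for the two resistor values I tried,
    // plus what P = V * I would come out to if that V really is the
    // drop across the resistor.
    #include <cstdio>

    int main() {
        const double currentAmps = 0.240;            // 240 mA worst case
        const double resistorsOhms[] = {20.0, 40.0};
        for (double ohms : resistorsOhms) {
            const double volts = currentAmps * ohms; // V = I * R
            const double watts = volts * currentAmps;
            std::printf("%.0f ohm: V = %.1f V, P = %.2f W\n", ohms, volts, watts);
        }
        // 20 ohm -> 4.8 V / 1.15 W, 40 ohm -> 9.6 V / 2.30 W.
        // Which of those numbers is "the voltage drop" is exactly the part
        // I can't work out.
        return 0;
    }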
Maybe that's right, but the way I arrived at it is really confusing. I'm not sure what the straightforward way to find this voltage drop is. And I assume I need to know it in order to know what wattage resistor I need, and what the maximum ohms I can use is so as to keep that wattage rating to a minimum.
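For what it's worth, here's how I'm guessing it's supposed to go: start from how much voltage I can afford to lose (the 5V supply minus the roughly 3.5V I figure the driver and LEDs need) and work backwards from there. This is only a sketch of my guess, so please correct it if the approach is wrong:

    // My guess at the right way around: start from the voltage I can afford
    // to lose, then find the biggest resistor that fits and its dissipation.
    #include <cstdio>

    int main() {
        const double supplyVolts = 5.0;
        const double minLoadVolts = 3.5;     // assumed floor for the TLC5947 + LEDs
        const double currentAmps = 0.240;    // 240 mA worst case

        const double allowedDropVolts = supplyVolts - minLoadVolts;  // 1.5 V
        const double maxOhms = allowedDropVolts / currentAmps;       // R = V / I, about 6.25 ohm
        const double watts = allowedDropVolts * currentAmps;         // P = V * I, 0.36 W

        std::printf("Biggest resistor: %.2f ohm, dissipating %.2f W worst case\n",
                    maxOhms, watts);
        return 0;
    }

If that's the right approach, the resistor ends up tiny (single-digit ohms) and even a half-watt part would cover the worst case, but I'd really appreciate someone confirming before I go buy parts.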