So I’m revisiting a circuit to see how I could cut down on cost, board space, and assembly time, and I’ve started doubting one of my improvements.
Basically I have a two-sided board with an SMD RGB LED on each side. The setup is identical on both sides: 5 V through a ~330 Ω resistor into each color channel. This works well, but I think I can do without those 3 additional resistors on the bottom side.
So the solution seems simple: decrease the resistor values to allow more current through each channel and connect the two LEDs in parallel after each resistor, fed from 5 V. I’ve implemented the idea in a simulator and it seems to work, but I’m worried I’m missing something. The current divides correctly, but I’m not sure each LED will reach its forward voltage and turn on.
The simulator tells me the voltage drops by about half, to around 2.5 V, yet even with the LED forward voltage around 2.9 V it thinks the LED will turn on. I’m not sure if this is a bug in the simulator or if I’m missing something. If I remember my theory correctly, the entire line stays at 5 V until after the LED, where it drops, so they should light up despite being split from the same power source.
I’ve attached the schematic from the circuit simulator. Each of those circuits should provide an equivalent level of brightness for each LED (I think it was ~8 mA per LED).
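For what it’s worth, here’s the back-of-envelope check I did on the per-channel current. It’s just Ohm’s law across the series resistor; the 2.9 V and 1.8 V forward drops are my rough assumed figures, not measured values:

```python
def led_current_ma(v_supply, v_forward, r_ohms):
    """Current through one LED channel with its own series resistor."""
    return (v_supply - v_forward) / r_ohms * 1000

print(led_current_ma(5.0, 2.9, 330))  # ~6.4 mA for a green/blue-ish channel
print(led_current_ma(5.0, 1.8, 330))  # ~9.7 mA for a red channel
```

So the “~8 mA per LED” I quoted is really somewhere in the middle of that spread, depending on the channel.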
Apologies if this sounds like an ignorant question and thanks ahead of time for any constructive input!
Simulators and the real world are from different planets.
The only way you can (simply) run multiple LEDs off a single resistor is if they are all wired in series. Even then, you should only wire LEDs that require the same current, and make sure the sum of the individual forward voltages adds up to no more than, say, 80% of the supply voltage. You need the 20% or so dropped across the resistor to provide current stability.
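That sizing rule is easy to sketch in a few lines of Python (the numbers in the example are made up, just to show the arithmetic):

```python
def series_resistor(v_supply, forward_voltages, i_amps, max_fraction=0.8):
    """Size the single resistor for a string of LEDs wired in series.

    Returns None if the summed forward voltages exceed max_fraction of
    the supply -- you need the remaining ~20% across the resistor for
    current stability.
    """
    v_leds = sum(forward_voltages)
    if v_leds > max_fraction * v_supply:
        return None  # not enough headroom, pick a higher supply
    return (v_supply - v_leds) / i_amps

print(series_resistor(12.0, [2.0, 2.0, 2.0], 0.010))  # 600 ohms
print(series_resistor(5.0, [2.0, 2.0, 2.0], 0.010))   # None: 6 V > 80% of 5 V
```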
In your example picture all the LEDs have the same intensity. But only if they are all the same LEDs.
I assume your SMD RGB LED contains three LEDs. They all have very different voltage drops, anywhere from 1.8 V (red) to 3.4 V (blue). You have to know the voltage drop to calculate the current. And you have to check for yourself whether the same current actually looks like the same brightness.
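To put some numbers on it: if you wanted the same current (say your 8 mA) from each colour, each channel needs its own resistor sized for that colour's drop. The Vf figures below are typical assumed values, not from any particular datasheet:

```python
def resistor_for(v_supply, v_forward, target_ma):
    """Series resistor needed to hit target_ma through one channel."""
    return (v_supply - v_forward) / (target_ma / 1000)

for colour, vf in [("red", 1.8), ("green", 2.2), ("blue", 3.4)]:
    print(f"{colour}: {resistor_for(5.0, vf, 8.0):.0f} ohm")
# red: 400 ohm, green: 350 ohm, blue: 200 ohm
```

Three different resistor values for three different colours, which is why one shared resistor can't do the job.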
In the end, I think you need many resistors....
The problem with the single-resistor example is that you have no way to guarantee the current will divide equally between the two LEDs. There will always be some small forward-voltage variation from LED to LED, unless you take the time to hand-select from a batch. The tried-and-true method is to give each LED its own series current-limiting resistor.
You definitely cannot parallel LEDs of different colours: only the red one will light up, since it needs just 1.7 to 1.9 V or so, while the other colours won't start to glow until 3 V or so. Separate resistors (or constant-current driver outputs) are required.
Yes, if you need to go constant current, there's a simple circuit using an LM317 that can provide it, and it can drive as many LEDs in series as you can fit before the total forward voltage gets close to the supply voltage. This doesn't guarantee equal brightness, just equal current. In parallel won't work unless you have some really weird control scheme, or you wrestle with resistors to drop the voltage for the lower-Vf LEDs. Even then you're at the mercy of component tolerances: one board might work fine, while the same circuit built with different parts could have one LED that never lights and another that gets toasted.
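The LM317 constant-current trick boils down to one formula: the regulator holds its 1.25 V reference across the programming resistor, so I = 1.25 / R. A rough sketch, with the ~3 V headroom figure being my conservative assumption for dropout plus reference:

```python
V_REF = 1.25  # V, LM317 OUT-to-ADJ reference voltage

def lm317_program_resistor(i_amps):
    """Resistor between OUT and ADJ that sets the constant current."""
    return V_REF / i_amps

def string_fits(v_supply, forward_voltages, headroom=3.0):
    """True if the series LED string leaves enough voltage for the
    LM317 (headroom covers dropout plus the 1.25 V reference)."""
    return sum(forward_voltages) + headroom <= v_supply

print(lm317_program_resistor(0.020))       # 62.5 ohm for 20 mA
print(string_fits(12.0, [2.0, 2.0, 2.0]))  # True: 6 V string on 12 V
print(string_fits(12.0, [3.4, 3.4, 3.4]))  # False: 10.2 V leaves too little
```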
If you want to shrink things down, consider surface-mount resistors. They are tiny little buggers: you can solder one end to the LED pin and the other end to something else, or mount them on two PCB pads. You can get a tape roll of 100 units of 330R (330 Ω) for a reasonable price and be set for life. Just check their power dissipation.