Dear fellow arduino users,
I've done quite a lot of research on the topic of driving high-power LEDs (typically 1W or 3W RGB LEDs). Their usual characteristics are a forward voltage drop of around 3.5V and a typical current of 350mA.
Many people drive them with a simple transistor (or some darlington structure like ULN2003).
But several members of this forum say this is a wrong way of driving them, and a constant current driver is needed.
My questions are :
-Is it wrong to use a transistor or a darlington (like here)? Are there any risks for the LED? Any heating risk?
-If yes, what is the correct way to drive them? A constant current driver I guess?
Thank you for your answers, I'm quite confused by the differences between all the schematics for driving high-power LEDs that I've found on the internet.
The 22 Ohm resistors are in that circuit to limit current. If they are sized correctly they will prevent over-current on the LED. A constant-current driver will do roughly the same thing and provide better current control if the input voltage might vary (like in a car power system where "12V" can go over 14V).
If you wish to run high-power LEDs at or near their maximum rated operating current then you should always use a constant-current driver. If you are willing to run a power LED at substantially lower than its maximum rated current, then the typical (properly sized) series resistor driven from a voltage source and switched by an Arduino-controlled transistor would probably work OK.
> If they are sized correctly they will prevent over-current on the LED.
The point is that the forward voltage drop changes with age and temperature, so as the LEDs warm up the current will change and push it over the edge.
For a reliable long lived design you need a constant current driver.