10mm Ultra Bright LED with no resistor?

I plan to use a 3.3V wall wart to power a 10mm Ultra Bright Blue LED. Its specs say 20mA, with a forward voltage of 3.2V typical and 3.8V max.
I ran the numbers and came out with about a 5 ohm current-limiting resistor in this case. Seems silly; a wire probably has that much. So I tried using the 3.3V output from the Arduino to power the LED with no limiting resistor. It seems to work fine for at least 30 seconds, and did not blow up or overheat. Is this OK to do long term, or will the LED or wall wart cook?
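For what it's worth, running the datasheet numbers shows why the result feels off: with the typical 3.2V forward drop there is only 0.1V of headroom across the resistor, and a worst-case 3.8V part has no headroom at all. A quick sketch, plain Ohm's law on the leftover voltage, using the figures from the post:

```python
def series_resistor(v_supply, v_forward, i_led):
    """Ohm's law applied to the voltage left over after the LED's forward drop."""
    headroom = v_supply - v_forward
    if headroom <= 0:
        return None  # no headroom: the LED can never reach the target current
    return headroom / i_led

# Typical part (Vf = 3.2 V): about 5 ohms of series resistance
print(series_resistor(3.3, 3.2, 0.020))
# Worst-case part (Vf = 3.8 V max): no resistor value works at 3.3 V
print(series_resistor(3.3, 3.8, 0.020))
```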

Thank You

Maybe it has its own resistor?

What current do you measure actually flowing through the LED?

Never did :slight_smile: Trying to figure out a way to hold my probes and the LED and the connections in my hand all at once :slight_smile:

Alligator clips?

Can't find 'em. Anyway, I managed to hold everything together long enough to get a reading: 12mA at 3.3V. Not at all what I expected, if the meter is to be trusted with such a tiny current anyway. I wonder if increasing the voltage/current until it reaches 20mA will significantly increase brightness. It already seems pretty bright, honestly; I wonder if it will be diminishing returns.

Has the potential to bite you and cause problems.
I would get an LED that has a forward voltage drop of 1.5 to 2.5V and use a series resistor.

So I tried using the 3.3V output from the Arduino to power the LED with no limiting resistor.

That might seem to work as a one-off, but remember the spec said that the forward voltage is:

Its specs say 20mA, with a forward voltage of 3.2V typical and 3.8V max.

So with some LEDs you will be within the current spec, with some (like you found) below it, and others might not light at all.

Bottom line: you do not have enough voltage available to drive this LED reliably. Note also that the forward voltage changes with the age of the LED and with the ambient temperature. So while you might get away with it sometimes, it is bad practice.

That makes more sense. I was thinking it was up to me to decide that the voltage should be higher, whereas I always thought the diode set the forward voltage by its design.

Thank You.

LED voltage drop is a function of design and color (plus age, junction damage, etc.).

And the CMOS output driver in the AVR also typically has a 0.3 to 0.6V drop, depending on whether it sinks or sources the load. Your (supply voltage - (LED Vd + CMOS Vd)) / desired LED current = the proper resistance.
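A sketch of that formula, with the poster's blue LED on a hypothetical 5V pin. The 0.45V driver drop below is just an assumed mid-range figure, not a datasheet number:

```python
def led_resistor(v_supply, v_f_led, v_drop_driver, i_led):
    """R = (supply voltage - (LED Vd + CMOS Vd)) / desired LED current."""
    return (v_supply - (v_f_led + v_drop_driver)) / i_led

# Blue LED from the post (Vf typ. 3.2 V) on a 5 V pin, 20 mA target,
# assuming ~0.45 V of output-driver drop:
r = led_resistor(5.0, 3.2, 0.45, 0.020)
print(r)  # about 67.5 ohms; use the next standard value up
```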

If you are working with an LED whose Vdrop is too great for 3.3V, then a MOSFET can be used to switch the LED from a +5V or higher rail so that you get the correct current flow for consistent brightness.
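For example (a sketch only; the ~0.1V MOSFET drop is an assumed figure for a low-Rds(on) part, not a measured one):

```python
# 5 V rail, the post's blue LED (Vf typ. 3.2 V), 20 mA target,
# assumed ~0.1 V drop across the MOSFET switch
v_supply, v_f, v_fet, i_led = 5.0, 3.2, 0.1, 0.020
r = (v_supply - v_f - v_fet) / i_led
print(r)  # about 85 ohms; a standard 82 or 100 ohm part would do
```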

In my designs, I often use a buck DC-DC converter to derive the 3.3V uC and sensor voltages, but source the LEDs from the +5 or +12V rail that feeds the 3.3V buck unit. Use care in automotive applications, as the +12V rail can rise to 17V during lead-acid battery charging. In those cases you may wish to regulate both the +3.3 and +5V supplies.
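A worked example of why that 17V excursion matters, using the poster's LED figures and ignoring the small change in Vf with current: a resistor sized for the nominal 12V rail passes well over the 20mA spec at 17V, and its dissipation more than doubles.

```python
v_f, i_target = 3.2, 0.020             # blue LED figures from the post
r = (12.0 - v_f) / i_target            # resistor sized for the nominal 12 V rail
i_charging = (17.0 - v_f) / r          # same resistor during charging
p_charging = (17.0 - v_f) * i_charging # power burned in the resistor at 17 V

print(round(r))                     # about 440 ohms
print(round(i_charging * 1000, 1))  # about 31 mA, over the 20 mA spec
print(round(p_charging, 2))         # about 0.43 W, so use at least a 1/2 W part
```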