Voltage or Current?

I learnt that when we drive an LED, we should consider the working voltage. But then I asked myself about the current, because I can't calculate the resistor I need without it; then I read that you can take the current as 20 mA.

Here is the question:

The thing that runs the LED is the current (the electrons), not the voltage. Voltage is like a pressure difference that moves the electrons. So why do we consider the voltage first, and why do LEDs have a working voltage rather than a working current? (For example, a red LED generally works at nearly 2 V.)

Thanks

LEDs are not linear devices. The forward voltage (not "working voltage") is the voltage at which the junction (the diode) turns on. Below that voltage, the LED is an OPEN. At that voltage (and slightly higher) it becomes a SHORT.

Because you can think of it as a SHORT once it has turned on, you need to limit the current with a linear device, like a resistor. Using Ohm's law you can calculate a resistor value to set the LED's current. (In a series circuit the voltage drops add up and the current is the same through each device.)

The 20 mA figure is the LED's rated maximum current, not a requirement. You do not always have to set it to 20 mA!
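
For example, a typical 5 mm indicator LED is rated around 20 mA, but most are clearly visible at just a few milliamps, so a gentler current is often fine.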

I did a short video tutorial on LEDs and Current Limiting resistors: https://www.youtube.com/watch?v=81zNcctopBI

To calculate the required resistor:

R = (Vs-2)/I

where R is the required resistor in ohms

Vs is your supply voltage (5 V with the Arduino supply)

I is the required current through the LED in amps (see the LED spec); 10 mA (0.01 A) will usually work.

If you leave the current in mA, the resistor value will come out in kilohms.

The 2 is a rough estimate of the voltage drop across the LED, which is required for it to start conducting.
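
As a minimal sketch of that formula (the constants assume a 5 V Arduino supply, a red LED with a rough 2 V drop, and a 10 mA target; change them to match your parts):

// Computes the series resistor from R = (Vs - Vf) / I and prints it.
const float Vs   = 5.0;    // supply voltage in volts (Arduino 5 V rail, assumed)
const float Vf   = 2.0;    // rough forward drop of the LED in volts (red LED, assumed)
const float Iled = 0.010;  // target LED current in amps (10 mA, assumed)

void setup() {
  Serial.begin(9600);
  float R = (Vs - Vf) / Iled;  // Ohm's law on the voltage left over for the resistor
  Serial.print("Series resistor: ");
  Serial.print(R);             // 300.00 for these values
  Serial.println(" ohms");
}

void loop() {}

Rounding up to the next standard value (330 ohms here) errs on the side of less current, which is the safe direction.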

Dead_Ard: The 2 is a rough estimate of the voltage drop across the LED, which is required for it to start conducting.

The actual value here will also depend on the LED's colour.
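
For example, at 10 mA from a 5 V supply (illustrative numbers only): a red LED with a drop of about 2 V needs (5 - 2)/0.01 = 300 ohms, while a blue or white LED with a drop of about 3.2 V needs (5 - 3.2)/0.01 = 180 ohms.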

ElectroniCat: The thing that runs the LED is the current (the electrons), not the voltage. So why do we consider the voltage first, and why do LEDs have a working voltage rather than a working current?

Hi,

LEDs run on energy, which is the product of voltage and current, but if we supply the LED with a proper current it will assume the voltage that is right for the diode at the time. That voltage may change a little, but if you keep the current about the same all the time (like 10 mA or 15 mA) then the LED lights and the voltage drop is determined by the LED itself.

The voltages you read about are the characteristic voltages: roughly 1.2, 1.8, 2.2, 3.5 V, and so on. These are the approximate voltages the LED will assume when driven at the current level it was designed to run at.

In a way this is the opposite of a light bulb that you screw into a socket at home. The bulb requires a certain voltage, and if you apply that voltage it draws the current that is right for the power the bulb was designed for. For the LED, however, we apply a current and let the device assume whatever voltage is appropriate given its internal chemistry, although we know from experience that the LED will choose a voltage that is close to the characteristic voltage.

So for the bulb, we choose the voltage and the bulb assumes the right current. For the LED, we choose the current and the LED assumes the right voltage. In each case the variable we don't really know at the time stays within a certain range, given the normal operating power level of the device.
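
A quick worked example (assuming a red LED with a rough 2 V drop and a 10 mA target): from a 5 V supply the resistor is (5 - 2)/0.01 = 300 ohms; from a 12 V supply it is (12 - 2)/0.01 = 1000 ohms. The supply and the resistor change, but the LED itself still sits at about 2 V and 10 mA, because we chose its current.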

the LED will choose a voltage that is close to the characteristic voltage.

So the LED has a choice? Not the best use of words.

The common model of diodes acting like a constant voltage load is only a very crude approximation. It is good enough for small signal diodes and indicator LEDs, but anything that needs higher power or higher precision will need a more sophisticated model.

The most proper way to say it is close to what James C4S said: LEDs (and diodes in general) have a nonlinear I-V (current-voltage) characteristic. In particular, for diodes the current as a function of voltage is exponential. Above a certain voltage (often called the forward voltage) the current increases extremely rapidly as the voltage rises. If you just connect a constant-voltage source straight across a diode willy-nilly, it will be very hard to ensure a proper amount of current flows through it: small changes in the source's output produce huge changes in diode current.
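
To get a feel for how steep that exponential is, here is a small sketch of the Shockley diode equation, I = Is*(exp(V/(n*Vt)) - 1). The saturation current Is and ideality factor n below are made-up illustrative values, not data for any real LED:

// Prints diode current versus voltage for the Shockley equation.
const double Is = 1e-18;  // saturation current in amps (assumed; varies hugely between parts)
const double n  = 2.0;    // ideality factor (assumed)
const double Vt = 0.026;  // thermal voltage at room temperature, about 26 mV

void setup() {
  Serial.begin(9600);
  for (double v = 1.7; v <= 2.05; v += 0.1) {
    double i = Is * (exp(v / (n * Vt)) - 1.0);  // diode current in amps
    Serial.print(v, 1);
    Serial.print(" V -> ");
    Serial.print(i * 1000.0, 2);
    Serial.println(" mA");
  }
}

void loop() {}

With these assumed constants the current climbs from a fraction of a milliamp at 1.7 V to about 50 mA at 2.0 V; every extra 0.1 V multiplies it by roughly 7. That is why a bare voltage source is such a poor LED driver.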

So a circuit is necessary to tame the diode's current. Three methods are in common use.

The simplest, cheapest, and crudest method is to place a single component with a more manageable I-V characteristic (commonly a resistor) in series with the diode.

This has the disadvantage of being power-inefficient and imprecise. For efficiency, high-power LEDs used for illumination are often driven by a switch-mode power supply configured to regulate the current instead of the applied voltage. For precision, many display drivers include linear current regulators to ensure uniform brightness across all the LEDs in the display.
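
To see the inefficiency, assume a 1 A white LED with about a 3 V drop on a 5 V supply: the series resistor must drop 5 - 3 = 2 V at 1 A, so it dissipates 2 W while the LED gets 3 W. Roughly 40% of the supply power just heats the resistor, which is why high-power drivers regulate the current directly instead.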

Thanks all of you, thanks a lot :)