Sorry for the noobish question; I am brand-new to electronics and confused as this seems to contradict what I have learned.
I have a green LED, rated 2.6V at 28mA max. I'm running a wire from my Arduino's digital pin 8 to the top row of a breadboard, and a separate wire from a different row on the breadboard to the Arduino's ground. Connecting the pin-8 row to the ground row are the green LED and a 100-ohm resistor in series.
My confusion is this. I was led to understand that if you did not have an appropriately sized resistor in series with the LED, it would quickly burn out due to the high current flowing through it. However, when I remove the 100 ohm resistor, the LED continues to be lit, but more brightly.
I know that digital pin 13 has a 1K-ohm resistor on it, but the other pins have no resistors in the circuit from the digital pin to ground (according to the schematics). What is my misunderstanding here?
For the curious, my code is as follows:
int ledPin = 8;
Thanks so much!
It's burning brighter because it's drawing more current. How much more? You would have to measure it with a current meter. The AVR chip has a 40 mA absolute-maximum current per I/O pin before possible damage.
It is very poor practice to drive LEDs directly from an Arduino output pin without a series resistor; the risk you take is all on you. However, if you do ever end up with a burned-out I/O pin, a replacement AVR chip with a bootloader burned into it is available from several sources for around $7.
On most Arduinos, pin 13 doesn't have a resistor in line with the pin, only in line with the built-in LED. If you want to put an external LED on 13, it should have a resistor too. I believe some early Arduinos do have a resistor on the pin itself, so you can use an LED directly.
I understand why the LED gets brighter, and that I'm supposed to run the LED in series with a resistor to keep from causing damage. However, my confusion lies in the fact that the LED isn't burning out, even though it is rated at 28 mA max. Why is this happening? I knocked out the resistor by accident; it wasn't my intention to practice poor electronics, but thanks for the reminder!
You're absolutely right, I misspoke. The 1K resistor is in series with the LED on the ground line. Thanks for clarifying :).
So where is my misunderstanding of electronics? Why is the LED, rated at 28 mA and 2.6 V, not burning out when connected between digital pin 8 and ground without a resistor in series?
Thanks for your help everyone!
Absolute maximum rating is not the same as typical rating.
I bet that if you left the no-resistor LED and an exact equivalent with a resistor lit next to each other, the no-resistor LED would stop working long before the one with the resistor.
Again, if you just measured the current being drawn by the LED without a resistor, you would see whether or not you are exceeding the max rating of either the AVR output pin or the LED.
If the measured current is below both the max rated current of the AVR (40 mA) and the LED (28 mA), then you might leave it that way, at your own risk. A properly calculated series resistor is the correct way to do it.
So where is my misunderstanding of electronics?
It is that exceeding the specified, or even the absolute maximum, ratings does not result in the immediate destruction of a device.
However, if you exceed these ratings, the manufacturer can't guarantee it will continue to work. In fact damage will occur, but you won't necessarily see the consequences immediately; it will show up as a shortened life. Of course, exceed the ratings by a lot and you will kill the device immediately.
It is rather like drinking alcohol. There are safe levels, dangerous levels and lethal levels. You can go on getting drunk night after night and your liver is taking a hit; it is being damaged, but you don't know about it yet.
The only difference is that with electronics, manufacturers can stress hundreds of devices to see exactly where damage will occur. With alcohol and people that approach is considered unethical.
It is rather like drinking alcohol. There are safe levels, dangerous levels and lethal levels.
Nice analogy (.. but makes me feel very sorry for my liver )
LEDs should always be treated as constant-current devices. The resistor for the LED should be determined from the nominal current draw of the LED and the amount of voltage to be dropped from the Vcc supply (5 VDC for the digital pins and digitally driven analog pins). For a 2.6 V, 28 mA LED that works out to (5 V - 2.6 V) / 0.028 A, or about 86 ohms, so you would use the nearest standard value of 100 ohms. This is a relatively high-current, high-voltage LED; most are driven at around 20 mA and 1.7 to 2.2 volts, which works out to roughly 150 ohms. This will yield the highest rated output without damage to the LED. The greatest threat in using the LED without the current-limiting resistor is to the processor, which is limited to 40 mA per output. You are in effect current-limiting the LED through the processor right now and possibly damaging the chip. Always watch your current; it'll kill your CPU faster than anything else.
To dim your LED you turn it fully on and fully off rapidly using a PWM output (analogWrite() on the Arduino's PWM-capable pins) and vary the duty cycle from 0 (off) to 100% (fully on); any switching rate above roughly 100 Hz is too fast for the eye to see flicker. This is the way to achieve very dim, glowing LEDs.
If you only want the LED as an indicator, you can run it way below its rating. A 2 V, 20 mA LED works nicely as an indicator with a 1k resistor on 5 V (about 3 mA). No danger of stress to either the LED or the microcontroller. Kind of like drinking a small bottle of lager (gnat's p*ss) every other day.
Just tried one of these newfangled 'water clear' red LEDs with a 10k resistor on 5 V; it wasn't going to illuminate very much, but there was no doubt it was on in normal indoor lighting. An old red one didn't want to know at 10k, but it worked at 1k.