I'm completely new to electronics and seem to have a fundamental misunderstanding.
I'm preparing to set up an IR Receiver & Transmitter in order to be able to trigger some stuff on the rear breadboard.
My understanding problem only concerns the front breadboard, though.
The following image might help to illustrate my problem.
When I measure the voltage from + to - I get 4.65 V.
I know my IR transmitter (the green LED is currently a placeholder for it) wants to be fed with ~1.5 V and 20 mA.
Now I wanted to calculate what kind of resistor I need.
From what I learned about Ohm's law, I calculated the following:
Voltage across the resistor: 4.65 V - 1.5 V = 3.15 V
R = 3.15 V / 0.02 A
R = 157.5 Ω
I couldn't find a single resistor only slightly larger than that, so I decided to take 2x 100 Ω and place them in series to get 200 Ω.
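To double-check my arithmetic, here is the same calculation written out as a small Python sketch (just the values from above, nothing new):

```python
# Series resistor calculation for the planned IR transmitter (values from above)
V_SUPPLY = 4.65   # measured supply voltage, volts
V_LED    = 1.5    # forward voltage the IR transmitter wants, volts
I_LED    = 0.02   # desired current, amps

v_resistor = V_SUPPLY - V_LED      # voltage the resistor has to drop
r_needed = v_resistor / I_LED      # Ohm's law: R = V / I
print(f"Resistor must drop {v_resistor:.2f} V")    # 3.15 V
print(f"Required resistance: {r_needed:.1f} Ohm")  # 157.5 Ohm

r_chosen = 100 + 100               # two 100 Ohm resistors in series
print(f"Chosen resistance: {r_chosen} Ohm")
```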
I did this as you can see in the image, but when I measure the voltage between the anode and the cathode of the LED I get 2.15 V.
Please forgive what is probably a really simple misunderstanding on my part.
It simply doesn't click in my thought process yet.
But if the two resistors are supposed to limit the voltage to a maximum of 1.5 V, how can it be that the green LED suddenly has more than 1.5 V across it?
I imagine resistors as something that decreases the voltage permanently, like a water hose that is being stepped on. But my calculation simply doesn't add up: I'm ~0.6 V above my calculated value, even though I used about 43 Ω more than needed to get down to 1.5 V.
That would be correct if I planned on using a green LED, but I plan to swap the green LED for an IR transmitter, which only needs 1.5 V at 0.02 A - so I based my calculation on the IR transmitter.
I only wanted to verify with the green LED that I got the voltage down to the level I want before replacing it with my IR transmitter (so it doesn't break if I calculated something wrong).
I'm really not sure whether I'm completely misunderstanding you right now, or whether you misunderstood my plan.
Thanks for helping me though!
I've probably just been dealing with this for too long without a break; my brain doesn't seem to comprehend anything anymore.
LEDs are not ohmic (linear) resistors. They maintain about the same (knee) voltage across a wide range of currents, much like Zener diodes do. So your calculation is right: you subtract the LED voltage from the supply voltage and divide by the desired current to get the required resistance.
Likewise, you can calculate the resulting current if you vary the resistance.
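Here is a rough sketch of both directions of that calculation, assuming the LED holds a roughly constant knee voltage (which is the approximation above):

```python
# Constant-forward-voltage approximation of an LED with a series resistor
V_SUPPLY = 4.65  # supply voltage, volts

def resistor_for_current(v_led, i_target):
    """Resistance needed so that roughly i_target flows through the LED."""
    return (V_SUPPLY - v_led) / i_target

def current_for_resistor(v_led, resistance):
    """Current that roughly flows for a given series resistance."""
    return (V_SUPPLY - v_led) / resistance

print(resistor_for_current(1.5, 0.02))        # ~157.5 Ohm for the IR LED
print(current_for_resistor(1.5, 200) * 1000)  # ~15.8 mA with 200 Ohm
```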
Hmm, I'm really trying to understand everything that is being said here, but sadly I'm not deep enough into the whole topic yet to follow all of it.
I will quickly swap the green LED for the IR transmitter and see whether the voltage changes compared to the green LED.
@shivahaze is using a GREEN LED.
But only to test the circuit; the misunderstanding is that @shivahaze believes the resistors set the voltage on the LED, when really the resistors set the current.
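To put rough numbers on that, using the values already mentioned in this thread (so treat them as approximate):

```python
# Current that the 200 Ohm series resistance sets for the green LED
V_SUPPLY = 4.65   # measured supply voltage, volts
V_GREEN  = 2.15   # measured drop across the green LED, volts
R_SERIES = 200    # two 100 Ohm resistors in series

current_ma = (V_SUPPLY - V_GREEN) / R_SERIES * 1000
print(f"{current_ma:.1f} mA")   # ~12.5 mA - below the 20 mA target, so the LED just runs a bit dimmer
```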
Oh wow, okay, the IR transmitter really does drop ~1.2 V, which roughly matches the calculations I made.
So the green LED actually misled me.
I still don't know exactly what I haven't fully understood yet, but that was a step forward!
I will take some time to read the links you guys posted and hopefully grasp what you're all trying to tell me.
I think at least part of your misunderstanding comes from thinking that the resistors control the voltage across the LED; they do not. The voltage across the LED comes from the internal workings of the LED and has nothing to do with the circuitry around it. As long as you don't exceed the maximum current of the LED it will always drop roughly the same voltage, which depends on its colour.
While being careful not to exceed the maximum current for the LED, try varying the current by using different resistor values. Measure the voltage across the LED; you should find it varies only by a small amount.
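As an illustration of what to expect, here is a quick sketch using the supply voltage from this thread and assuming the green LED holds roughly 2.1 V (an approximation, so the measured numbers will differ a little):

```python
# Expected current for a few series resistor values, assuming the green LED
# holds a roughly constant ~2.1 V forward voltage
V_SUPPLY = 4.65
V_LED = 2.1

for r in (150, 200, 330, 470, 1000):
    i_ma = (V_SUPPLY - V_LED) / r * 1000
    print(f"{r:>5} Ohm -> {i_ma:5.1f} mA")
```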