Hi. As you've probably guessed from the topic title, I have a problem with IR LEDs.
I am a beginner in the world of electronics, and I recently got my first Arduino Uno to play around with (it came in the Arduino Starter Kit 1).
The goal of my project is quite simple: send IR codes to my air conditioner and TV set.
- I used my IR receiver (with a 38 kHz demodulator, which is quite standard) to obtain the codes by pointing my remotes at it and recording what came out. It turns out my AC uses an unknown codeset and the TV uses NEC. For the AC, I recorded the raw timings and used the sendRaw function to send them.
(I am just mentioning this here so other people with the same issue might find it helpful.)
- I used the IRremote library for all of this logic, both receiving and sending codes.
- I managed to test it on my TV set, and it worked (yeey :D), but I had to place the Arduino about 5 cm in front of the receiver for it to work. This is where things started going south.
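Here is roughly what my sending sketch looks like, in case it helps (assuming IRremote 4.x; the pin number and the rawData timings below are placeholders, not my actual recorded codes):

```cpp
// Rough sketch of my setup, assuming the IRremote 4.x API.
// The rawData values are placeholders, not my real recorded frame.
#include <IRremote.hpp>

// Raw mark/space durations in microseconds, as dumped by the receiver
const uint16_t rawData[] = {9000, 4500, 560, 560, 560, 1690 /* ... */};

void setup() {
  IrSender.begin(3);  // IR LED (through its series resistor) on pin 3
}

void loop() {
  // Send the recorded frame on a 38 kHz carrier, then wait 5 s
  IrSender.sendRaw(rawData, sizeof(rawData) / sizeof(rawData[0]), 38);
  delay(5000);
}
```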
My first conclusion was that the LED wasn't getting enough power, and pointing my phone's camera at it confirmed it: the light was dim, especially compared to the light from my remote, which was at least twice as bright.
I have read online that most LEDs require around 20 mA. What stumped me is that the LED I have can tolerate up to 100 mA at maximum brightness (at least the datasheet says so under "forward current"). I know that for optimal performance it might be best to keep the LED at 80% of that, which would be 80 mA.
So, how the hell do I supply 80 mA to it and still control it from my digital output pins? Transistors?
My Arduino outputs 5 V on its digital pins (verified with a voltmeter). I also briefly measured exactly 80 mA with an ammeter (I assume the pin can only source that for short periods, so I stopped, not wanting to fry my Arduino).
As you might have noticed, this conflicting information makes a beginner like me very confused.
So: 40 mA is the recommended maximum per pin, most LEDs require 20 mA for optimal performance, but it turns out mine uses much more..?
LED specs are:
- Forward voltage: max 1.8 V (typ. 1.5 V)
- Reverse voltage: 5 V
- Forward current: 100 mA
A tutorial I followed showed me that the supply voltage must equal the sum of the voltage drops across the components in the circuit.
My question to you is: is my approach correct here, and can somebody help me with the resistors I need to properly handle this voltage drop? I only have the resistors from the starter kit (220 Ω, 550 Ω, 4.7 kΩ, 5 MΩ, 10 MΩ, 15 MΩ).
My first idea was:
5 V - 1.8 V = 3.2 V // the resistor in series with the LED must drop 3.2 V before GND
3.2 V / 0.080 A = 40 Ω // so that resistor must be 40 Ω
But if the Arduino pin cannot handle 80 mA, this is useless. Also, it might be hard to build a combined 40 Ω resistance from what I have.
Let me know where I am wrong.
Thank you for any help!