Would it be better to use this formula?
R = (5 - 2) / 0.020 = 150 ohms
So, if I do that, am I finding the maximum current?
So it is the lowest-value resistor that I can use? Would you say that a value of 220 ohms would be safe to use, or should I always leave a margin and use a bigger resistor?
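The arithmetic in the exchange above can be sketched in a few lines: compute the minimum safe resistor from Ohm's law, then round up to the next standard value to keep a margin. This is a minimal sketch, assuming a 5 V supply, a 2 V forward drop, and a 20 mA limit as in the formula above; the E12 rounding step is my addition, not something from the thread.

```python
# Pick a safe current-limiting resistor for an LED.
# Assumed figures (from the formula above): 5 V supply, 2 V drop, 20 mA max.
E12 = [10, 12, 15, 18, 22, 27, 33, 39, 47, 56, 68, 82]

def min_resistor(v_supply, v_forward, i_max):
    """Lowest resistance that keeps the LED current at or below i_max."""
    return (v_supply - v_forward) / i_max

def next_e12_up(r):
    """Smallest standard E12 value >= r, i.e. rounding toward more margin."""
    decade = 1
    while decade * E12[-1] < r:
        decade *= 10
    for base in E12:
        if base * decade >= r:
            return base * decade
    return E12[0] * decade * 10

r_min = min_resistor(5.0, 2.0, 0.020)       # 150 ohms, as computed above
print(r_min, next_e12_up(r_min))
```

Anything at or above the computed minimum is safe; 220 ohms just trades a little brightness for extra margin.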
Do you need the maximum current?
With the multimeter leads on 20A, I plugged in an LED with a 220 ohm resistor. The voltage from +5 to ground was 4.89 V, and across the LED it was 3.08 V. But I was already using a resistor, so I am guessing that is affecting my readings... But if I use Ohm's law like this, then I get 6 mA. Could this be right?
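As a quick check on those readings (my own arithmetic, not part of the thread): the current through the LED equals the current through the series resistor, so divide the resistor's share of the voltage by its resistance.

```python
# Quick Ohm's-law check of the LED current from the measured voltages.
v_supply_node = 4.89   # volts, +5 rail to ground (measured in the post)
v_led = 3.08           # volts, across the LED (measured in the post)
r = 220.0              # ohms, the series resistor

v_resistor = v_supply_node - v_led   # voltage dropped across the resistor
i_led = v_resistor / r               # I = V / R, same current through LED
print(f"{i_led * 1000:.2f} mA")      # prints 8.23 mA
```

With those numbers the current works out to about 8.2 mA rather than 6 mA, so it is worth rechecking which two voltages went into the division.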
Quote: "with the multimeter leads on 20A"

So you expect to see 6 mA on a 20 A full-scale meter? How many digits does it have? Put the range down to 200 mA and then you stand a chance.
So, my problem is that I am trying to figure out how to find the right resistor to use with LEDs.
Questions about LEDs and resistors are never simple...
The forward voltages are closely related to the energy of photons of light of the relevant colour, which can be calculated from the wavelength w (in nm) as 1240/w (in volts).
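The 1240/w rule of thumb above is easy to try out. The wavelengths below are typical figures for common LED colours that I've picked for illustration; they are not from the thread.

```python
# Photon energy in eV (roughly the floor of the LED forward voltage)
# from the wavelength in nm, per the 1240/w rule of thumb above.
def photon_energy_ev(wavelength_nm):
    return 1240.0 / wavelength_nm

# Illustrative wavelengths (typical values, assumed for this example):
for colour, nm in [("red", 625), ("green", 525), ("blue", 470)]:
    print(f"{colour} ({nm} nm): about {photon_energy_ev(nm):.2f} V")
```

This matches the familiar pattern that red LEDs drop around 2 V while blue and white ones need closer to 3 V, which is why the forward voltage in the resistor formula depends on the colour.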