Ohm's law

Hi guys. I have a huge misconception about Ohm's law, so please help me out here. I don't really understand the relation between intensity, voltage and resistance:

1. I don't understand how, if you put a 9V battery followed by a 340 ohm resistor, you will be left with about 20mA. For example, if you have 2 batteries with different amounts of intensity, will it always give you 20mA after you put the resistor?

2. If you have 20mA and you put a 20mA battery, will you end up with 0mA?

3. What does a resistor really do? Does it always reduce intensity? And how does it reduce voltage?

There are two things to consider here.
First, all things connected in series will have the same current.
So battery + to resistor '+', resistor '-' to LED '+' (anode), LED '-' (cathode) to battery -. One loop.
The battery has a voltage. The LED will have a voltage drop across it once enough current flows to turn it on. The resistor will have a voltage across it to complete the voltage loop.
Vbattery - Vled - Vresistor = 0.
If Vbattery = 9V, Vled = 2.5V (in the range for a red LED), then Vresistor must be 9V - 2.5V = 6.5V.
Current is then Ohm's law, V = IR, or V/R = I. So 6.5V/270 ohm resistor = 0.024A, or 24mA.
Similarly if you want to set the current at 20mA, then V/I = R, so 6.5V/.02A = 325 ohm.
If you use a transistor to control the current flow, then there will be another voltage drop to consider with a BJT transistor (NPN like 2N2222A, with Vce of maybe 0.5V or 0.7V), or a resistance Rds of a few ohms down to hundredths of ohms (0.01 ohm) with a MOSFET.
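A quick way to check this loop arithmetic is to run it as code. This is just a sketch of the numbers above (9V battery, 2.5V red LED, 270 ohm resistor), nothing Arduino-specific:

```python
# Series loop: battery + -> resistor -> LED anode -> LED cathode -> battery -.
# KVL for the loop: Vbattery - Vled - Vresistor = 0.
v_battery = 9.0    # V
v_led = 2.5        # V, typical red LED forward drop
r = 270.0          # ohms

v_resistor = v_battery - v_led           # 6.5 V left across the resistor
i = v_resistor / r                       # Ohm's law: I = V / R
print(f"current: {i * 1000:.1f} mA")     # about 24 mA

# Or work backwards: choose the resistor for a target current.
i_target = 0.020                         # 20 mA
r_needed = v_resistor / i_target         # R = V / I
print(f"resistor for 20 mA: {r_needed:.0f} ohm")   # 325 ohm
```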

A resistor just limits current flow. Voltage across the resistor is determined by the current flowing in it, V = IR.

Ohm's law, pretty simple:
E = voltage
I = amperage
R = resistance

Basically the resistance value is like a water line reduced in size, so say a 1 inch line would allow a certain flow whereas a 1/8 inch line would reduce the flow of water. I believe it is safe to say that the resistor dissipates current through heat. The larger the wattage and the value of the resistor the more current it can handle.

Probably not what you are asking, but the formulas are as follows:

E=I*R
I=E/R
R=E/I
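All three forms are the same equation rearranged. A tiny check with made-up numbers (9 V across 450 Ω, chosen only for illustration):

```python
E = 9.0     # volts
R = 450.0   # ohms (arbitrary example value)
I = E / R   # I = E/R -> 0.02 A

# The other two rearrangements recover what we started with.
assert abs(E - I * R) < 1e-9   # E = I*R
assert abs(R - E / I) < 1e-9   # R = E/I
print(f"{I * 1000:.0f} mA")    # 20 mA
```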

Thanks! But there is still something I don't get: the Uno board can output 20mA from the I/O pins. Why would you need to put a resistor to limit the current if 20mA is what you need?
The voltage drop of my LED is 1.8V, and the current is 20mA:

(5 - 1.8 ) / 0.02 = 160 ohms.

Thanks!

The 20mA is the SAFE limit. It will deliver much more, but may well be damaged. Hence you need an external current limiting resistor.

Allan.

The Uno outputs are just a transistor between Vcc and the pin or Gnd and the pin. There is nothing to limit current, that must be done externally. If not limited, you can fry the output transistor if 40mA is exceeded. Above 20mA the Rds of the transistor kicks in and High outputs will start to drop, and Low outputs start to rise as current x Rds = output voltage.
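A rough picture of that droop, modelling a High output as an ideal 5 V source in series with the output transistor's on-resistance. The Rds of 40 ohms here is purely a ballpark assumption; the real value depends on the chip and is in the datasheet:

```python
vcc = 5.0    # nominal High output
rds = 40.0   # ohms, assumed on-resistance of the output transistor

for i_ma in (5, 10, 20):
    i = i_ma / 1000.0
    v_pin = vcc - i * rds    # High output droops by I * Rds
    print(f"{i_ma:2d} mA -> pin sits at about {v_pin:.1f} V")
```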

A couple of things to know:

- Most voltage & power sources are "constant voltage". That is, as long as everything is operating normally a 9V battery puts-out (about) 9V and an Arduino output-pin puts-out (about) 5V when high.

The voltage is always there, even with nothing connected. But, if the connected resistance is too low, and the current is too high, the voltage will drop. With a 9V battery, the only damage would be shortened battery life. But in the case of the Arduino or transistors, MOSFETs, etc., there is danger of permanently killing something.

This constant-voltage thing is NOT a law of nature like Ohm's Law. It's just how "most things work". Obviously, there are variable voltages but the idea is that even most variable-voltages are controlled by something else, not the connected "load" resistance.

Another exception is constant-current power supplies used for high-power LEDs. But again, they are only constant current when "operating normally". There is a voltage limit to a constant-current supply, so if you connect a very-high resistance you may not get a very-high voltage.

- LEDs are diodes, and like all diodes they are non-linear. That means their resistance changes with voltage. At low voltage, an LED has very high resistance. As you go over its normal operating voltage (forward breakdown voltage for a regular diode) the resistance drops to nearly zero-Ohms. If there's nothing to limit the current "something" will burn-out... The LED may burn out, or the Arduino may burn-out, etc.

If we limit the current to the correct amount, the voltage across the LED "magically" falls-into-place where the LED's resistance is just-right and everything works nicely.

I understand almost everything. Imagine if you had a 5V 20mA power supply and you want to power up a 20mA LED with a voltage drop of 1.8V. You have the correct amount of current, but the voltage is too high. Do you need a resistor? If so, why? If not, is it really OK to power up an LED with 5V when its voltage drop is only 1.8?

Also, I don't really understand what it means that a motor needs, for example, 9V. If you have a 12V battery that has the same intensity as the motor, how do you reduce the remaining 3V? If you put a resistor, won't it reduce current too?

Thanks anyway, and sorry about asking these basic questions, I just seem to have misunderstood everything from the beginning.

Ohm's law is just a statement that resistors have a linear relationship between the current flowing
through them and the voltage needed to cause that current - it's what defines a resistor.

What happens in the whole circuit is determined by this and other relationships that you have to
solve in parallel - but Kirchhoff's laws and some rules of thumb help you figure this out for simple
circuits.

The classic supply, resistor, LED circuit has a constant voltage supply and an approximately constant
voltage load (the LED), so the voltage across the resistor is easy to determine, hence the current.

To analyze what happens with Arduino pins carrying large currents, you need to know that the Arduino
output transistors (for certain Arduino boards) are about 30 to 40 ohms when active (this can be
figured out from the datasheet).

A supply with a current limit has a conditional equation governing it - if the current is below the limit
it acts as constant voltage, otherwise it acts as a constant current supply - this complicates the
analysis.

Stop using the word intensity; it means nothing in this context.

When you say a power supply is so many amps, that is the maximum it will deliver.

When you connect a load to a power supply the current that will flow through a load is determined by the resistance of the load.

If the current the load wants is greater than what the power supply can provide, the voltage of the power supply drops and so does the current, until the current drawn by the load is what the power supply can give. This is a bad situation and should not be allowed to happen because it can damage the power supply.

You always need a current limiting device when driving an LED; a resistor is the simplest example of this. A complex example is a circuit that monitors the current and adjusts the voltage so that the current does not exceed a set value. This is called a constant current supply.

The bigger the voltage the more current it will try to drive through a given load.

icedgoal:
I understand almost everything. Imagine if you had a 5V 20mA power supply...

Do you happen to mean 20mAh (20 milliamp-hours)?

resistor dissipates current through heat.

The accepted concepts are:

1. Energy is dissipated as heat in the resistor; not the current.

2. Current remains the same along the loop. It is the carrier of electrical energy from source to load.

3. The current originates from the point of higher potential and sinks at the point of lower potential.

Grumpy_Mike:
Stop using the word intensity; it means nothing in this context.

Except that it's where "i" for current comes from.

icedgoal:
Hi guys. I have a huge misconception about Ohm's law, so please help me out here. I don't really understand the relation between intensity, voltage and resistance:

1. I don't understand how, if you put a 9V battery followed by a 340 ohm resistor, you will be left with about 20mA. For example, if you have 2 batteries with different amounts of intensity, will it always give you 20mA after you put the resistor?

2. If you have 20mA and you put a 20mA battery, will you end up with 0mA?

3. What does a resistor really do? Does it always reduce intensity? And how does it reduce voltage?

You should use the words 'current', 'voltage', and 'resistance'.

If you have a typical resistor (the kind you can buy from an electronics shop), you can generally use a formula that conveys Ohm's law, which allows you to calculate the current I through the resistor if you apply a voltage across that resistor. Or it can allow you to calculate the voltage V across the resistor if you know the amount of current flowing through that resistor.

The formula is V = I R ..... it is Ohm's law for resistive components.

There is also another version of Ohm's law that deals with 'impedances', but no need to consider that one for now.

1. V = 9 Volt, R = 340 Ohm; I = V/R = 9/340 = about 0.026 Amp.

2. A battery is normally considered to be a source voltage. Also, it is necessary to convey a question concisely. At the moment, the question has not been worded in a form that an experienced electronics worker can understand. A circuit diagram that could help to show what you mean may be handy here.

3. A resistor is one of the basic components (aside from capacitors, inductors, transistors, voltage sources, etc) having a parameter (or parameters) that can be used in an electronic circuit to make the circuit behave in a desired way - usually by design. It all depends on what components are used, or need to be used, and how those components are connected together. Circuit theory tutorials or courses will certainly help with getting started in understanding how these components can be used to design electronic circuits.

If you look at the formula, V = I R, you can rearrange it mathematically... such as V/R = I. So, as you can see, if the voltage across the resistor is some known value, then you can control the amount of current through that resistor by means of the resistor's resistance. So if you have an ideal 9 Volt battery (hypothetically assumed to have no internal resistance of its own) connected across a resistor, then the current flowing through the resistor is relatively small if the resistance you choose is relatively large.
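Putting numbers on that last point (ideal 9 V battery, resistor values picked arbitrarily), the bigger the resistance, the smaller the current:

```python
v = 9.0  # ideal battery, internal resistance assumed to be zero

for r in (340.0, 1000.0, 10000.0):
    i = v / r                 # Ohm's law: I = V/R
    print(f"{r:7.0f} ohm -> {i * 1000:6.2f} mA")
```

The first row reproduces the 9/340 = about 0.026 A figure from point 1 above.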

ardy_guy:
Except that it's where "i" for current comes from.

True, the conventional symbol for current is I, which originates from the French phrase intensité de courant, meaning current intensity.

But this forum is in English, and it is confusing the OP because from his questions he considers it as something different from voltage, current and resistance.

I understand almost everything. Imagine if you had a 5V 20ma power supply and you want to power up a 20ma led with a voltage drop of 1,8 V. You have the correct amount of current, but the voltage is too high. Do you need a resistor? If so, why? If not, is it really OK to power up an led with 5V when its voltage drop is only 1.8?

You're mixing up the ratings (specifications) of your components with what happens when you connect (wire) them up in a circuit.

• Ohm's law determines what happens in your circuit with regard to current, voltage and resistance.
• The ratings of the components determine the safe operating limits you need to adhere to.
All of the numbers in your comment above are specifications of your components.

For a 5V 20mA power supply, let's assume there's no overload protection. If you accidentally short out the output, it will temporarily deliver much more than 20mA, then fail.

For the 20mA LED, this is the maximum recommended operating current. Increasing current above this level reduces life expectancy and increases the chance of failure.

The LED voltage drop (VF) of 1.8V is a specification for the type / colour of LED you have. This is fairly consistent for any level of current through the LED. If you connect a 5V supply directly to the LED, you're forcing VF to become 5V which will immediately blow the LED.

What's the solution? Use Ohm's Law to protect your components (keep within specifications) and to set your desired operating level ... in this case it's current you need to control.

For example, if you need the LED to illuminate at about 50% of its maximum brightness, then you would need 10mA of current. To get 10mA of current, you need to calculate the series resistor required. Using Ohm's law, this would be R = V/I = (5 - 1.8) / 0.01 = 320Ω.

You could use the next closest standard resistor of 330Ω to get very close to the desired brightness. You could re-calculate using Ohm's Law to determine the actual current. It would be I = V/R = 3.2/330 = 0.0097 = 9.7mA.
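The same two calculations, checked as code (numbers straight from this post):

```python
v_supply = 5.0
v_led = 1.8        # LED forward drop
i_target = 0.010   # 10 mA for roughly half brightness

r_exact = (v_supply - v_led) / i_target    # R = V/I -> 320 ohm
r_std = 330.0                              # nearest standard value
i_actual = (v_supply - v_led) / r_std      # I = V/R -> about 9.7 mA

print(f"exact resistor: {r_exact:.0f} ohm")
print(f"with {r_std:.0f} ohm: {i_actual * 1000:.1f} mA")
```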

Good understanding, Grumpy. He does mean the current, as you pointed out, but is phrasing it in such a way that it could be confused with the intensity (brightness) of the LED. I understood him earlier but did not pick up on his chosen word; I just assumed what he was trying to communicate.

Thanks guys!