I will ask directly:
How to make sure that a given resistor will limit ONLY the current but NOT the voltage to a system / IC / control pin? What should the arrangement look like?
JMD1:
I will ask directly:
How to make sure that a given resistor will limit ONLY the current but NOT the voltage to a system / IC / control pin? What should the arrangement look like?
Have a look at this and I think you'll gain an understanding. Play around there with various values of resistors, voltage, and current.
I hope this helps.
How to make sure that a given resistor will limit ONLY the current but NOT the voltage to a system
This is not possible. Ohm's Law guarantees that if a resistor limits the current, it also drops voltage, so the voltage reaching the load changes as well.
What do you really want to do?
I'm not following why you would need to limit current to a system or IC. I can understand CL (current limiting) to a pin or an LED, though.
In general, most things DON'T need current limiting, as a device will only use what it needs.
Are we talking about an LED here?
tinman13kup:
I'm not following why you would need to limit current to a system or IC. I can understand CL (current limiting) to a pin or an LED, though. In general, most things DON'T need current limiting, as a device will only use what it needs.
So you're saying that things take only as much current as they need?
Well, if I power an LED from a 2 A, 3.3 V supply it's going to burn, right? The LED won't draw just 15 or 20 mA on its own...
JMD1:
So you're saying that things take only as much current as they need?
Well, if I power an LED from a 2 A, 3.3 V supply it's going to burn, right? The LED won't draw just 15 or 20 mA on its own...
I know you're not replying to me, but did you look at the Ohm's law calculator that I linked to? If you did, you'd see that you need about a 220 ohm resistor to run that 15 mA LED from 3.3 V. As the others suggest, "it's all about Ohm's law." It has absolutely nothing to do with "available" current.
edit: unless, of course, the "available" current is less than the 15 mA "demand" current in this case.
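To make that concrete, here is a minimal sketch of the usual series-resistor sum. The 2.0 V forward drop and 15 mA target below are assumptions for a typical red LED, not values from any particular datasheet:

#include <cstdio>

int main() {
    // Assumed figures - replace with your LED's datasheet values.
    const double supplyVolts  = 3.3;    // supply (or pin) voltage
    const double forwardVolts = 2.0;    // assumed forward drop of a red LED
    const double targetAmps   = 0.015;  // 15 mA target current

    // The resistor only has to drop whatever voltage the LED doesn't.
    double resistorOhms = (supplyVolts - forwardVolts) / targetAmps;

    printf("Series resistor: about %.0f ohm (round up to the next standard value)\n",
           resistorOhms);
    return 0;
}

Ignoring the forward drop entirely (3.3 V / 15 mA) gives the 220 ohm mentioned above; that just runs the LED a little dimmer. Either way, the resistor is what fixes the current, and the supply's 2 A rating never enters into it.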
JMD1:
So you're saying that things take only as much current as they need?
Well, if I power an LED from a 2 A, 3.3 V supply it's going to burn, right? The LED won't draw just 15 or 20 mA on its own...
LEDs are special cases because they are not Ohmian resistors. You CAN happily power your Arduino from a 5 V, 600 A source, though (as long as you don't make any mistakes).
LEDs also have a U-I (voltage-current) curve, so your initial question still does not make a lot of sense.
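To see why that U-I curve matters, here is a rough sketch using the ideal-diode (Shockley) equation. The saturation current and ideality factor are made-up constants, chosen only so the numbers land in an LED-like range; they are not from any datasheet:

#include <cstdio>
#include <cmath>

int main() {
    // Shockley equation: I = Is * (exp(V / (n * Vt)) - 1)
    const double Is = 1e-27;   // saturation current [A] (illustrative assumption)
    const double n  = 2.0;     // ideality factor (illustrative assumption)
    const double Vt = 0.0257;  // thermal voltage at ~25 degC [V]

    // Step the applied voltage in 0.2 V increments and watch the current explode.
    for (double v = 2.8; v <= 3.41; v += 0.2) {
        double i = Is * (std::exp(v / (n * Vt)) - 1.0);
        printf("V = %.1f V  ->  I = %.4g A\n", v, i);
    }
    return 0;
}

With these illustrative numbers, a 0.6 V change in applied voltage takes the modelled current from well under a milliamp to tens of amps. That is why you control an LED by setting its current (usually with a resistor), not by trying to hit the "right" voltage.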
JMD1:
So you're saying that things take only as much current as they need?
Well, if I power an LED from a 2 A, 3.3 V supply it's going to burn, right? The LED won't draw just 15 or 20 mA on its own...
That is why I asked the question. True, an LED is not a linear resistor; that is, it does not obey a linear voltage/current relationship.
If you read the data sheet for an LED, nowhere does it say it takes a voltage. What it does say is what the voltage drop will be when the current is at a specific value.
Bad questions do not get good answers, because you leave out so much.
ElCaron:
LEDs are special cases because they are not Ohmian
Ohmic.
aarg:
Ohmic.
Thanks. Not a native speaker.
JMD1:
So you're saying that things take only as much current as they need?
Well, if I power an LED from a 2 A, 3.3 V supply it's going to burn, right? The LED won't draw just 15 or 20 mA on its own...
I can understand CL (current limiting) to a pin or an LED, though.
Perhaps I wasn't clear enough?
There has to be a specific reason to require limiting current. LEDs specifically require limiting current. Some pins may require limiting current. It's all in the specifics, namely the datasheets. If there is no specific need to limit current, then don't.
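As a concrete example of "some pins may require limiting current", here is a rough sketch of the two resistor sums that come up most often. The 40 mA absolute-maximum pin current is the classic ATmega328P figure, and the ~1 mA clamp-diode limit is a commonly quoted rule of thumb; treat both as assumptions to check against your own datasheet:

#include <cstdio>

int main() {
    // Assumed limits (verify against your MCU's datasheet / app notes):
    const double vcc       = 5.0;    // supply voltage
    const double pinMaxA   = 0.040;  // absolute-maximum current per I/O pin
    const double clampMaxA = 0.001;  // suggested maximum into the clamp diodes

    // 1) Output pin: resistor that keeps the pin under its absolute maximum
    //    even if the load is accidentally a dead short to ground.
    double rOutMin = vcc / pinMaxA;
    printf("Output pin: >= %.0f ohm keeps a shorted load under %.0f mA\n",
           rOutMin, pinMaxA * 1000.0);

    // 2) Input pin fed from a higher voltage (assumed 12 V signal): resistor
    //    that limits the current pushed into the internal clamp diode.
    const double vin    = 12.0;  // assumed external signal level
    const double vDiode = 0.5;   // assumed clamp-diode drop
    double rInMin = (vin - vcc - vDiode) / clampMaxA;
    printf("Input pin from %.0f V: >= %.0f ohm limits clamp current to %.0f mA\n",
           vin, rInMin, clampMaxA * 1000.0);
    return 0;
}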
I see there have been a lot of answers, and I know from experience that sometimes it takes a while for someone to "get it", so here's my contribution.
Note: To simplify things I have taken some liberties with details. It's not meant to steer you wrong, but to help you grasp the basic essentials.
In general, components require either a controlled voltage OR a controlled current. NEVER BOTH.
The component datasheet will define which one it requires, but it will never literally say "this requires constant voltage" or "this requires constant current". Some will chime in and state there are components that sit in the middle. This is true, but it doesn't help with the basics.
Without exception, each component will operate over a range of inputs (either voltage or current, whichever the component requires).
Examples:
- An Arduino board requires a voltage input. It could be in a range of 9 to 12 V. The board will determine how much current it draws (the amount of current required is in the datasheet).
- ICs in general require a voltage input.
- Motors (in general) require a voltage input.
- A bare LED, as a component, requires a current input. There are not many in this category.
- LED strips require a voltage input (this is because they already include resistors to control the current).
Hope this somewhat helps. (A small numeric illustration of the voltage-input case follows below.)
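Here is that small illustration: a quick sketch of the "a voltage-input device draws only what it needs" idea. The 50 mA board draw and the supply ratings are illustrative assumptions, not measurements:

#include <cstdio>
#include <algorithm>

int main() {
    // A voltage-input device sets its own current draw; the supply's current
    // rating is only a ceiling, never a push. Figures below are assumptions.
    const double deviceDrawA     = 0.050;             // what the board actually needs
    const double supplyRatings[] = {0.5, 2.0, 20.0};  // three differently rated 5 V supplies

    for (double ratingA : supplyRatings) {
        // Actual current is the device's demand, unless the supply can't keep up.
        double actualA = std::min(deviceDrawA, ratingA);
        printf("%.1f A-rated supply -> device draws %.3f A\n", ratingA, actualA);
    }
    return 0;
}

That is also why the "2 A supply will burn the LED" worry only applies to current-input parts like a bare LED; for voltage-input parts the extra supply rating just sits unused.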
I'm trying to think of more practical examples which require a current input.
Bipolar transistors require a current input - there's always a base resistor.
MOSFETs, on the other hand, are voltage-controlled. Many designs don't have a resistor on the gate at all, or only a pull-down resistor to assert a known voltage on the gate while the Arduino is resetting.
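Since the BJT really is a current-input part, here is a rough sketch of the usual base-resistor arithmetic. The transistor figures (Vbe, minimum gain, overdrive factor) are generic assumptions for a small-signal NPN, not datasheet values:

#include <cstdio>

int main() {
    // Assumed figures for a generic small-signal NPN switching a small load:
    const double pinVolts  = 5.0;    // MCU output-high level
    const double vbe       = 0.7;    // assumed base-emitter drop
    const double loadAmps  = 0.100;  // collector current the load needs
    const double hfeMin    = 100.0;  // assumed minimum current gain
    const double overdrive = 5.0;    // drive the base ~5x harder to saturate

    // The base *current* is what controls the transistor...
    double baseAmps = (loadAmps / hfeMin) * overdrive;
    // ...and the base resistor turns the pin's voltage into that current.
    double rBase = (pinVolts - vbe) / baseAmps;

    printf("Base current: %.1f mA, base resistor: about %.0f ohm\n",
           baseAmps * 1000.0, rBase);
    return 0;
}

In practice you would round to a nearby standard value; the point is that the resistor is chosen to set a current, which is exactly the "current input" case.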
I'm trying to think of more practical examples which require a current input.
How about:-
Transimpedance amplifier - Wikipedia
Grumpy_Mike:
How about:-
Transimpedance amplifier - Wikipedia
Yes. That looks like a useful method to work with optocouplers.
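For anyone following along, the transimpedance amplifier's defining relation is just Vout = -Iin x Rf, so sizing the feedback resistor is a one-liner. The 50 uA photocurrent below is an assumed example figure, not taken from any particular optocoupler:

#include <cstdio>

int main() {
    // Transimpedance amplifier: output voltage magnitude = input current * Rf.
    const double photoAmps    = 50e-6;  // assumed photodiode/optocoupler current
    const double desiredVolts = 2.5;    // output swing wanted at that current

    double rFeedback = desiredVolts / photoAmps;
    printf("Rf = %.0f ohm gives %.1f V out for %.0f uA in\n",
           rFeedback, desiredVolts, photoAmps * 1e6);
    return 0;
}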