
Topic: Resistors, Voltage vs Current (Read 855 times)


Feb 09, 2013, 07:21 am Last Edit: Feb 09, 2013, 07:26 am by afanasyevich Reason: 1
May I ask about the metering device (multimeter) you use? Almost every multimeter has some reading error (usually described in the user manual). It also has an input impedance: analog voltmeters commonly have around 20 kΩ input impedance, and against your 10 kΩ resistor that is about 50% of the input impedance. Most digital voltmeters come with 100 kΩ or greater. In theory, a voltmeter with greater input impedance gives a better measurement. Assuming you use a digital voltmeter with 1 MΩ input impedance, the loading is only about 1%, but it still affects your result.
Another thing to check: your resistors may have a tolerance of about 5% (gold band). More precise resistors will give a better result; use resistors with 1% tolerance or smaller.
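The loading effect described above can be sketched numerically. The helper below is a minimal illustration, not from any specific meter's datasheet: it assumes an ideal 10 V source driving a simple two-resistor divider (10 kΩ over 10 kΩ), and models the meter's input impedance as appearing in parallel with the resistor being measured.

```python
def measured_voltage(v_source, r_top, r_bottom, r_meter):
    """Voltage the meter actually reads: r_meter sits in parallel
    with r_bottom, lowering the effective divider ratio."""
    r_eff = (r_bottom * r_meter) / (r_bottom + r_meter)  # parallel combination
    return v_source * r_eff / (r_top + r_eff)

v = 10.0
true_v = v * 10e3 / (10e3 + 10e3)  # 5.0 V with an ideal (infinite-impedance) meter

# Illustrative input impedances: 20 kΩ analog, 100 kΩ and 1 MΩ digital.
for r_m in (20e3, 100e3, 1e6):
    reading = measured_voltage(v, 10e3, 10e3, r_m)
    err = 100 * (true_v - reading) / true_v
    print(f"meter {r_m:>9.0f} ohm -> reads {reading:.3f} V ({err:.1f}% low)")
```

With a 20 kΩ meter the 5 V node reads about 4 V (20% low); at 1 MΩ the error drops to roughly 0.5%, which matches the point above: higher input impedance loads the circuit less.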


OK, I'm going to investigate and get back :)


I heard Ohm's Law was going to be repealed....

Faster than light electrons..........Then again, electrons might not be fundamental particles in 20 years

James C4S

I heard Ohm's Law was going to be repealed....

Ohm's Law isn't a suggestion, like speed limits.
Capacitor Expert By Day, Enginerd by night.  ||  Personal Blog: www.baldengineer.com  || Electronics Tutorials for Beginners:  www.addohms.com


Many power sources are not ideal, but can be modeled as an ideal voltage source in series with an "internal resistance"; drawing current from them therefore causes the terminal voltage to fall. Big, well-regulated power supplies have a very low internal resistance (well below an ohm), while a cheap watch battery may have hundreds of ohms of internal resistance.
[ I won't respond to messages, use the forum please ]
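The series model above is just a voltage divider between the internal resistance and the load. A minimal sketch, with illustrative (not measured) values for a bench supply and a watch battery:

```python
def terminal_voltage(emf, r_internal, r_load):
    """Terminal voltage of a non-ideal source: the ideal EMF divided
    between its internal resistance and the external load."""
    return emf * r_load / (r_internal + r_load)

# Assumed example values: a regulated 5 V supply with 0.05 ohm internal
# resistance vs a 3 V watch battery with 300 ohm internal resistance,
# both driving a 100 ohm load.
print(terminal_voltage(5.0, 0.05, 100.0))   # barely sags below 5 V
print(terminal_voltage(3.0, 300.0, 100.0))  # collapses to a fraction of 3 V
```

The supply delivers nearly its full EMF, while the battery's terminal voltage drops to 0.75 V under the same load, which is why internal resistance matters far more than the no-load voltage printed on the label.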
