Hi everyone, I'm trying to build a voltmeter with an Arduino and a voltage divider, but I don't know how to make it handle a variable input voltage. For example, if I measure 10V and later 3V, I want the voltmeter to react automatically and show the correct voltage.
Can't you measure 3V with your 10V meter? For a 10V meter you need a "divide by 2" voltage divider to bring the voltage down to a level that is safe for the Arduino analog input. When you put 10V on the divider you will read 5V at the Arduino, and you multiply that by 2 to display 10V. If you then put 3V on that divider you will read 1.5V, and multiply that by 2 to display 3V. What's the problem?
Johnwasser, thanks a lot. I didn't understand how to measure it, but your "divide by 2" example gave me the answer; I'm doing a divide by 6.
Now I have to build an ammeter with 2 resistors in parallel. Sorry to ask you so much, but it's the same idea, right?
You only need one resistor to measure current (Ohm’s Law).
Normally, you’d measure the voltage across those 2 resistors and calculate the current through each. (In parallel, the total current is the sum… one of Kirchhoff’s Laws.) If you are required to measure it with the Arduino, you’ll need a 3rd resistor in series with those. Then measure the voltage across the 3rd resistor and calculate the current in your sketch. That 3rd resistor needs to be low enough in value (compared to the existing resistors) that inserting it in series doesn’t have much effect on the current flow.
But, current is tricky!* Especially with the Arduino!
The Arduino only measures voltage, and it ONLY measures voltage relative to its ground. That means the sense resistance has to be on the ground-side of the circuit-under-test. Be careful! If you attach the Arduino’s ground to the wrong part of the circuit-under-test, you can burn up stuff… If the Arduino is connected to USB, you could potentially fry your computer.
Since you generally want the smallest possible voltage drop when measuring current, you may want to use the optional 1.1V internal ADC reference (that gives you resolution down to about 1mV). That’s assuming you’re using a 3rd ‘current measuring’ resistor.
…Multimeters don’t have an (external) ground; they only have + & - terminals. But since the meter presents a low-resistance load in current mode, you can fry stuff if you connect it wrong. The current connection on the meter is often a separate terminal, and it’s usually fused (so the meter doesn’t get blown, even if you fry your circuit).
Real multimeters are over-voltage protected and reverse-voltage protected (they can read negative voltage and/or reversed connections).
but I don’t know how to make the input voltage be variable, for example if I’m gonna measure 10V and later 3V
Real multimeters either have a range-switch or they are auto-ranging where internal firmware switches resistors and switches the calculation.
* I work in electronics and I can’t remember the last time I measured current (with a multimeter)… I measure voltage & resistance all the time, but rarely current. To be fair, the power supply on my bench does have voltage & current meters built-in, so I can “see” the supply current and tell if I have a short or “over current” situation.