Dumb question about a voltmeter

Hi community,

I'm a newbie in electronics, and my first steps are leaving me confused.

I have a power adapter with input 230V~/50Hz 56mA and output 12VAC/500mA 6VA.
(Exactly: [tt]AC/DC Adapter AVM03036 Model: PW41/20 DC -EU 12 -0500. Input: 230V~ / 50Hz 56mA. Output: 12V = 500mA 6VA[/tt]; it is the power adapter from my fritzbox, which I don't use anymore.)

I just want to gain some experience in electronics.
My first step was to measure the output voltage with a voltmeter. I connected the "+" of the output cable (the black-and-white wire) to the "+" of the multimeter and the "-" (the plain black wire) to the "-" of the multimeter. On the multimeter I selected VAC (200V range).

I expected to see about 12 on the voltmeter's display, but it constantly shows 22.7.

Where is the error in my reasoning?

I know this must be a very basic question...

:slight_smile: Can anyone outthere help me? :slight_smile:

Kind regards,
Andy

Well, the specs you've posted for the adapter say it's an AC/DC adapter.
Meaning it outputs 12V DC, not AC, and as such you should set your multimeter to DC measurement :slight_smile:

Many of these AC/DC adapters require a load to go into regulation. Just measuring "open-ended" isn't valid. You should connect a load (e.g. a resistor) when measuring.
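As a rough sketch of how to size such a load (taking the 12V / 500mA rating on the label as given): pick the current you want to draw and use Ohm's law, R = V / I, then check the power, P = V x I. For example, drawing 100mA from 12V needs R = 12 / 0.1 = 120 ohm, and that resistor has to handle P = 12 x 0.1 = 1.2W, so a plain 1/4W part is not enough there.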

Hi community,

thanks for your replies!

@Rejected666: I'm very sorry, I made a mistake when writing it down. On the top of the adapter it says
"12V (followed by the DC symbol: a line at the top with three dots underneath) / 500mA 6VA". So I think the output is DC, right?

@James C4S: I made the same mistake above: I confused the AC symbol with the DC symbol. On the voltmeter I had selected DC. I also connected an LED with a resistor, and yesterday the result was nearly the same.

Very, very strange: today I made the same measurement again. When I chose the 200V DC range (!), the display showed 283! And there is NO decimal point anywhere in the display. It's really distressing to report such strange things. I think I'll have to go to an electronics shop and ask about it. I'm really confused! But thanks a lot for your answers!!!

Kind regards,
Andy

Perhaps your multimeter is faulty?

Measure an AA battery. 1.5V DC should read 1.5V DC.

When in doubt, measure a known item. As suggested, the 1.5V battery is a good choice, and so is the 5V pin on your Arduino.

Hi community,

testing a battery is a very good idea.

I took a new 1.5V AAA battery and chose the 20V DC range on the voltmeter.
The display says: 2.37!!! (This time there was a decimal point in the display!)
The measurement was made without a resistor.

With a 2.2 kOhm resistor the display shows 2.39.

So I think the voltmeter is broken. Do you think so too, community? :astonished:

I can't believe it... My very first steps in electronics and I already have broken equipment! :*

Kind regards,
Andy

I once had a very cheap multimeter which initially gave very odd readings, including some with two decimal points. The cause was that the display was slightly out of alignment with the PCB. I undid the screw holding the PCB in place and gently moved the display from side to side until I got a reading that made sense!

zamunda:
I took a new 1.5V AAA battery and chose the 20V DC range on the voltmeter.

Was that the most sensitive setting you could use?

I can't believe it... My very first steps in electronics and I already have broken equipment!

That's FRUSTRATING!!! The good news is, it's rare! Test equipment tends to be reliable. If it breaks, it's usually a broken test lead or something, and it's usually more obvious, like the reading never changes or it's just "dead"... Test equipment can go "out of calibration", but it usually doesn't drift too far, and hobbyists rarely worry about that kind of precision/accuracy. Your meter is not out of calibration. It's broken or defective...

When I chose the 200V DC range (!),

FYI - It's generally safe to use a lower scale, even if you don't know the voltage. That is, the meter should not be damaged if you are on the 20V scale and you connect to 200V, or if you are reading Ohms and you accidentally connect to 200V, etc. You should just get some sort of "overload" display.

The exception is current. That's why there is usually a special connector on the meter for measuring current. Too much current, or "shorting" the current probes across a voltage supply, can blow the current fuse in the meter, or it can damage the circuit you are measuring. (This has nothing to do with your problem... With a blown current fuse, the meter will still measure voltage & resistance.)
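(A rough illustration of why that fuse is so easy to blow, using typical values as an assumption: on a current range the meter looks almost like a plain piece of wire, well under 1 ohm, so placed straight across a supply it passes whatever current the supply can deliver, and the mA input of a typical handheld meter is fused at only a few hundred milliamps.)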

Hi community,

I have bought a new digital multimeter. And this time a more expensive one. :slight_smile:

And I can't believe it: measuring my 12VDC adapter shows 16VDC on the display. (Also
when I connect a resistor.) I'm not happy with the delta of 4V, but I guess it's within tolerance.
(Or isn't it?)

XD Thanks a lot to everyone who helped me solve my first problem!!! XD

Kind regards,
Andy

Try loading it with a 500mA (24 ohm) load, or maybe even a 220 ohm resistor; the voltage will probably drop. Caution: the resistor will get hot.
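To put numbers on the caution (assuming roughly 12V at the output): the 24 ohm load draws 12 / 24 = 0.5A and has to dissipate 12 x 0.5 = 6W, and even the 220 ohm resistor draws about 12 / 220 = 55mA and dissipates about (12 x 12) / 220 = 0.65W, which is still more than a common 1/4W resistor is rated for.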

zamunda:
Hi community,

I have bought a new digital multimeter. And this time a more expensive one. :slight_smile:

And I can't believe it: measuring my 12VDC adapter shows 16VDC on the display. (Also
when I connect a resistor.) I'm not happy with the delta of 4V, but I guess it's within tolerance.
(Or isn't it?)

XD Thanks a lot to everyone who helped me solve my first problem!!! XD

Kind regards,
Andy

What you are seeing is pretty normal for most unregulated AC/DC power modules. The output voltage is generally rated for when the module is supplying its maximum rated current, and its output voltage rises as the load current decreases. That is the reason we use voltage regulator devices in circuits where the voltage must not change with variable current draw.
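A rough way to picture it, using the label and your own reading as the only inputs: the label promises 12V at 500mA, and you measured about 16V with almost no load, so the module behaves roughly like a 16V source with an internal resistance of about (16 - 12) / 0.5 = 8 ohm. A light load such as an LED plus resistor (a few mA) drops almost nothing across those 8 ohm, which is why the reading stays close to 16V.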

Lefty