For my lab I'm building a 4-channel power supply, with two digital DC buck converters (0-36 V) and two analog XL4015 buck converters with display (1.2-36 V), from different suppliers.
I have two DMMs, one Fluke and one ordinary meter.
When I turn them on, I get differences: for example, a DC converter shows 14.5 V, the Fluke shows 14.8 V, and the ordinary meter shows 14.7 V.
These differences appear on all 4 channels (buck converters).
Is it normal to see a 2% to 4% difference between what the buck display shows and what the DMMs show?
Nothing is perfect in the analog world (including the analog-to-digital converter built into your meters).
Do you have the specs for the meters? My Fluke at work is specified at ±(0.5% + 2 counts) on the DC range. It gets calibrated once a year by an outside calibration lab.
14.7 & 14.8 is a difference of one count (if that's all the resolution you have), and you might be on the "hairy edge" between two counts.
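To see how much spread a spec like that allows, here's a minimal sketch assuming a 0.1 V count (i.e. the meter reads to one decimal place); the helper function and the numbers plugged in are illustrative, not from any particular datasheet:

```python
def reading_band(true_v, pct, counts, count_size):
    """Worst-case reading band for a meter spec'd at +/-(pct% + N counts)."""
    err = true_v * pct / 100 + counts * count_size
    return true_v - err, true_v + err

# 14.5 V source, 0.5% + 2 counts, 0.1 V per count (assumed resolution)
lo, hi = reading_band(14.5, 0.5, 2, 0.1)
print(f"{lo:.2f} V .. {hi:.2f} V")  # -> 14.23 V .. 14.77 V
```

Two in-spec meters on the same source could disagree by up to twice that error, so the 0.1 V gap between your two DMMs is unremarkable.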
I assume the display built into your DC-DC is "approximate", with no actual accuracy specification.
Believe the Fluke. If you are worried about these differences, get the meters calibrated. You also want a decade more accuracy in your measuring device than in what you are measuring.
The absolute difference may be comparable, but the relative difference is not...
3.0 V vs. 3.3 V is a 10% difference....
Your meters are much closer than that. So: do not worry!
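To put numbers on both gaps, a quick sketch (the helper is purely illustrative; it takes the smaller reading as the base, which is where the 10% figure comes from):

```python
def rel_diff_pct(a, b):
    """Difference between two readings, as a percent of the smaller one."""
    return abs(a - b) / min(a, b) * 100

print(f"{rel_diff_pct(3.0, 3.3):.1f}%")    # 10.0% -> a big deal for a supply rail
print(f"{rel_diff_pct(14.5, 14.8):.1f}%")  # 2.1%  -> the OP's worst-case disagreement
```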