Thanks to everyone for the answers! I've bought myself a multimeter (I'm attaching its model in case anyone knows how reliable it is - tell me if these kinds of links are forbidden as advertising and I'll remove it) and made some measurements of the 5V, 3.3V and Vin pins using different power sources.
First, I attached my Uno board to my laptop via the USB Type-B port, and the pin voltages were as follows:
Vin: between 4.66V and 4.67V (unplugging the jumper wires from the multimeter and plugging them back in gave slightly different readings)
Then I plugged a 12V AC/DC adapter into the board's barrel jack and got these results:
Vin: between 15.4V and 15.5V
So one question arises: how come the 5V output is more accurate with the AC adapter? Is it because the USB input is below 5V, so the Arduino has to "upscale" it? Also, if the adapter is rated for a 12V 500mA output, how come Vin measures 15.5V? Another adapter, rated 12V 1.5A, gave the same reading on the 5V output but 11.3~11.4V on the Vin pin, much closer to the nominal value. Is that a normal consequence of the different current rating, or does it mean the first adapter is low quality or faulty?
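One thing I've read while searching (I'm stating it as an assumption, since I haven't traced it on my own board): the Uno routes the barrel jack through a series reverse-polarity protection diode before the Vin pin, so Vin should sit roughly one diode drop (~0.6-0.7V) below whatever the adapter actually delivers. A quick sanity-check of that relationship (the 0.7V drop is a typical figure, not something I measured):

```python
def expected_vin(v_jack, v_diode=0.7):
    """Estimate the Uno's Vin pin voltage from the barrel-jack voltage.

    Assumes a single series reverse-polarity protection diode between
    the jack and Vin; v_diode is a typical drop, not a measured value.
    """
    return v_jack - v_diode

# A regulated adapter really delivering 12.0V at the jack:
print(expected_vin(12.0))  # close to the 11.3~11.4V I measured
```

Under that assumption, the 11.3~11.4V reading from the 1.5A adapter looks like a genuinely regulated 12V output minus the diode, while the 15.4~15.5V reading would mean the 500mA adapter itself is delivering about 16V when lightly loaded.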
To give further data, here are the Vin readings measured with other 12V adapters, each with a different rated output current:
Then I used a 9V 200mA adapter; here are the results:
Vin: between 10.4V and 10.5V
Lastly, powering the board from a 9V 800mAh Li-ion rechargeable battery that outputs 8.26V (just charged), the readings are the following:
The issue has probably been covered already (I didn't find anything, though), but from this quick test it seems that the higher the source voltage, the more accurate the 5V output is. The 3.3V output doesn't seem to be affected by the different sources and is always more accurate. Could that be used as a voltage reference instead?
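For anyone wanting to reproduce this, one way to compare the rails on an equal footing is percent deviation from nominal rather than raw volts; a minimal sketch (the readings in the example are placeholders, not my actual measurements):

```python
def deviation_pct(measured, nominal):
    """Percent deviation of a multimeter reading from the nominal rail voltage."""
    return (measured - nominal) / nominal * 100.0

# Placeholder readings purely for illustration:
for rail, nominal, measured in [("5V", 5.0, 4.95), ("3.3V", 3.3, 3.29)]:
    print(f"{rail}: {deviation_pct(measured, nominal):+.2f}%")
```

Expressed this way, a 0.05V error on the 3.3V rail is a bigger relative miss than the same 0.05V on the 5V rail, which matters when deciding which one to trust as a reference.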