In your case, the best approach would be to measure the voltage across the resistor and calculate the current using [u]Ohm’s Law[/u] (Current = Voltage/Resistance).
Since the LEDs and resistor are in series, the same current flows through the resistor as through the LEDs. Note that you can’t use Ohm’s Law directly on the LEDs because they are nonlinear: an LED’s effective resistance changes as the voltage across it changes.
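As a quick sketch of the calculation (the 2.2 V reading and 330 Ω resistor here are made-up example values, not from your circuit):

```python
# Infer the LED current from the voltage measured across the series resistor.
# Example values below are assumptions for illustration only.

def led_current(v_resistor, resistance):
    """Ohm's Law: I = V / R. In a series circuit this is also the LED current."""
    return v_resistor / resistance

# Suppose you measure 2.2 V across a 330-ohm resistor:
current_a = led_current(2.2, 330)
print(f"{current_a * 1000:.1f} mA")  # prints 6.7 mA
```

The same arithmetic works on a calculator, of course; the point is that one voltage measurement plus the known resistor value gives you the current without ever breaking the circuit.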
We rarely measure current directly, mainly because you have to break the circuit to route the current through the meter. I work in electronics, and I measure voltage & resistance every day, but I can’t remember the last time I measured current. (The exception is the voltage & current meters built into my bench power supply; it’s often helpful to see if a board under test is drawing excess current.)
There is also a fuse inside the meter, and if you have a short (or something else causing an unexpectedly large current) you can blow that fuse.
Electricians use a “clamp-on” current meter to measure “house current” without breaking the circuit or actually touching the wires. (I think these work magnetically, but they might use a Hall-effect sensor.)