Hi,
I’m trying to measure how much current my servo motor draws so I can set up the supply circuit properly (I need to drop 9V down to 6V for the servo). The servo is a Futaba S3003.
The problem is that I only get normal behavior when the multimeter lead is plugged into the 10A (unfused) jack. There I read between 0.1 and 0.4 A (i.e. 100 to 400 mA), but when I move the lead to the “mA” (fused) jack, the servo behaves erratically and stops responding to commands.
I’m not sure I’m measuring this correctly, so please help if you can.
Also, the manufacturer doesn’t list the device’s operating current. Is there an easy way to figure this out, or do I just need to try different resistor values in series and find the minimum current at which it still operates properly?
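
In case it helps, here’s a rough sketch of how I could log the current with a low-value shunt resistor instead of the meter, assuming an Arduino drives the servo (the signal pin, the 0.1 ohm shunt value, and the 5 V ADC reference are all assumptions about my setup, not measured values):

// Rough current-logging sketch (assumes an Arduino, pin 9 for the
// servo signal, a 0.1 ohm shunt in the servo's ground return, and
// the shunt voltage read on A0 -- all assumptions about my setup).
#include <Servo.h>

Servo servo;
const float R_SHUNT = 0.1;   // shunt resistance in ohms (assumed)
const float VREF = 5.0;      // ADC reference voltage (assumed)

void setup() {
  Serial.begin(9600);
  servo.attach(9);           // signal pin (assumed)
}

void loop() {
  // Sweep the servo so both idle and moving current show up.
  for (int angle = 0; angle <= 180; angle += 30) {
    servo.write(angle);
    delay(300);              // give the servo time to actually move
    float vShunt = analogRead(A0) * VREF / 1023.0;
    float amps = vShunt / R_SHUNT;  // Ohm's law: I = V / R
    Serial.print("angle=");
    Serial.print(angle);
    Serial.print("  current=");
    Serial.print(amps, 3);
    Serial.println(" A");
  }
}

With a 0.1 ohm shunt the ADC resolution is coarse (roughly 50 mA per count at a 5 V reference), so this would only give a ballpark figure, but that may be all I need.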
Thanks.