continuous servo, low speed more power?

Hi there:

I am measuring the power requirements of a continuous-rotation servo. When it is running at full speed the current draw is small... however, when I set the speed to a low value, it eats 0.2 A!

Can anybody explain to me why this happens? I would have expected running at high speed to need more power.

Thanks in advance.

Perhaps the reason is that low speed for these servos means running a bit, then a micro-stop, then running, then another micro-stop, and so on?

????

Current draw depends on how much torque the servo is providing, and that is the most likely reason, especially given the high gearing in the servo, which consumes a large share of the available power. Beyond that, the FETs in the servo will run less efficiently when at slower speeds; does it seem to get warmer when run slowly?

Beyond that, the FETs in the servo will run less efficiently when at slower speeds; does it seem to get warmer when run slowly?

Are you sure of this?
Do you know what the mechanism of this is?
I can't think why this should be.

As to the original question, it depends on the internal circuitry of the servo. However, a DC motor draws more current when it is stalled than when it is running, and more when it is running slowly than quickly. This is because at a stop, or at slow speed, the coil resistance is the main factor limiting the current. When the motor is running, the alternating current in the coil is limited by the inductive reactance, which slows the rise time of the current in the coil: the current changes direction before the full DC value can flow.
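The rise-time argument above can be sketched numerically with the first-order RL response of a motor winding. All component values here (supply voltage, coil resistance, coil inductance) are made up purely for illustration, chosen so the stalled current lands near the 0.2 A mentioned in the question:

```python
import math

# Hypothetical motor-coil values, for illustration only.
V = 5.0       # supply voltage (V)
R = 25.0      # coil resistance (ohms)
L = 0.01      # coil inductance (H)
tau = L / R   # electrical time constant (s)

def coil_current(t):
    """First-order RL rise: i(t) = (V/R) * (1 - e^(-t/tau))."""
    return (V / R) * (1 - math.exp(-t / tau))

# At stall (or very slow speed) the current has time to reach the
# full DC value, limited only by the coil resistance:
i_stall = V / R

# When the current is reversed quickly, it never gets there; e.g.
# if the drive switches after only one tenth of a time constant:
i_fast = coil_current(0.1 * tau)

print(f"steady-state (stall) current: {i_stall:.3f} A")
print(f"current after 0.1*tau:        {i_fast:.3f} A")
```

The point is only qualitative: the shorter the time the current has to build up before it is reversed, the further it stays below the resistance-limited DC value.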

Grumpy_Mike:

Beyond that, the FETs in the servo will run less efficiently when at slower speeds; does it seem to get warmer when run slowly?

Are you sure of this?
Do you know what the mechanism of this is?
I can't think why this should be.

Why wouldn't you expect the FETs to run hotter (less efficiently) under the PWM or linear mode that would be required for the slower speed?

Can anybody explain to me why this happens? I would have expected running at high speed to need more power.

Check the below on the back EMF a motor produces when spinning fast, which acts to reduce the forward current flow through the motor.

Why wouldn't you expect the FETs to run hotter ..... under the PWM

Because:-

  1. When the FET / motor is off, it should generate no heat, because no current flows through it.
  2. When the FET / motor is on, it generates heat because of the current flowing through the on-resistance of the FET.
  3. When the FET is being switched on and off at, say, a 50% duty cycle, the heat situation alternates between 1 and 2, so it runs cooler than in case 2.
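The three cases above can be sketched as an average conduction-loss calculation. The motor current and FET on-resistance below are hypothetical values, chosen only to make the arithmetic concrete (this ignores switching losses, which the argument above also sets aside):

```python
# Illustrative numbers only: a small FET switching a servo motor.
I_on = 0.2      # motor current while the FET conducts (A)
R_ds_on = 0.5   # FET on-resistance (ohms), hypothetical

def fet_conduction_loss(duty):
    """Average conduction loss: the FET dissipates I^2 * R while on
    and nothing while off, so the average scales with duty cycle."""
    return duty * I_on**2 * R_ds_on

print(f"always on (case 2): {fet_conduction_loss(1.0) * 1000:.1f} mW")
print(f"50% duty  (case 3): {fet_conduction_loss(0.5) * 1000:.1f} mW")
print(f"off       (case 1): {fet_conduction_loss(0.0) * 1000:.1f} mW")
```

At 50% duty the average dissipation is half that of the fully-on case, matching the claim that case 3 runs cooler than case 2.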

Why wouldn't you expect the FETs to run hotter (less efficiently) under ...... linear mode

Here you are using a definition of "less efficiently" that is not valid. In linear mode the circuit is designed to burn off the excess power, and that is exactly what it is doing, very effectively. There is no change in the efficiency of the FET itself; the loss of efficiency comes from the design decision to have the FET burn off the excess power.