Ok, I've got two related problems to discuss here, and I think the solution to both probably lies in using the right piece of maths to smooth out the readings I'm taking, but it's not as simple as I'd hoped it would be:
- Low speed interpolation. I've been building my own servo system, and my "encoder" has a resolution of one "tick" per degree of shaft angle. It's an absolute encoder with analogue functionality and no ability to produce timed sharp edges: I cannot read the time at which a change of position occurs, only what the position is during each iteration of my control loop. The physical sensitivity of the system means that below 1 "degree" I can't hope for any more accuracy. This is fine at higher speeds, but I want to be able to run at really low speeds too, that is to say speeds where you get less than 1 "degree" (1 tick) of shaft angle change per iteration of the control loop.
The current I supply to the motor (and hence the torque it develops) is increased or decreased on each pass of the control loop by an amount proportional to the difference between the speed I'm achieving and the speed I want to achieve: I raise the current to speed up, and lower it to slow down.
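To make the control law above concrete, here is a minimal sketch in Python. The function name, gain value, and units are all illustrative assumptions, not taken from the actual firmware: each loop, the commanded current is nudged by an amount proportional to the speed error.

```python
# Illustrative sketch of the per-loop current update described above.
# All names and the gain value are assumptions for illustration only.

def update_current(current, measured_speed, target_speed, gain=0.05):
    """One control-loop iteration: nudge current in proportion to speed error."""
    error = target_speed - measured_speed
    return current + gain * error  # raise current to speed up, lower it to slow down

# Example: motor running slower than target, so the commanded current rises.
i = update_current(current=1.0, measured_speed=8.0, target_speed=10.0)
print(i)  # 0.05 * error of 2.0 -> current rises from 1.0 to 1.1
```

Note this is effectively an integral action on the speed error (the current accumulates the error over time), which matters for how it interacts with a quantised speed measurement.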
I tried the obvious answer, low-pass filtering the incoming angle reading, but this just results in jumpy behaviour where my controller responds with bursts of speed followed by bursts of slowness. A low-pass filter should smooth things; instead it makes them jerkier.
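A small simulation of this filter-then-differentiate approach, with an angle input quantised to whole degrees as described. The filter constant, speeds, and loop count are all assumed for illustration; the point is that differencing the filtered angle still produces a swinging per-loop speed estimate rather than a steady one.

```python
# Sketch of low-pass filtering a 1-degree-quantised angle, then differencing
# it to estimate speed. Parameter values are illustrative assumptions.

def simulate(true_speed_deg_per_loop=0.2, n_loops=50, alpha=0.1):
    filtered = 0.0       # first-order IIR low-pass state
    prev_filtered = 0.0
    true_angle = 0.0
    speeds = []
    for _ in range(n_loops):
        true_angle += true_speed_deg_per_loop
        measured = float(int(true_angle))          # encoder quantised to whole degrees
        filtered += alpha * (measured - filtered)  # low-pass the angle
        speeds.append(filtered - prev_filtered)    # per-loop speed estimate
        prev_filtered = filtered
    return speeds

speeds = simulate()
# Despite a constant true speed of 0.2 deg/loop, the estimate swings
# between 0 and roughly 0.24 deg/loop:
print(min(speeds), max(speeds))
```

A proportional-style controller fed this estimate would command alternating bursts of too much and too little current, which matches the jerky behaviour described.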
I also tried comparing against a stored value of the measured angle from 4 or 8 control loops ago, rather than just comparing the angle measured in this run of the loop to the angle measured during the last run. The same issues as with the low-pass filtering occurred.
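The "compare against N loops ago" variant might look something like the following sketch (ring buffer size and names are assumptions, not the original code). Even averaged over 4 loops, the quantised input makes the estimate jump between discrete values:

```python
# Illustrative sketch: estimate speed as (angle now - angle N loops ago) / N,
# using a small ring buffer of past readings.
from collections import deque

class SpeedOverN:
    def __init__(self, n_loops=8):
        self.history = deque(maxlen=n_loops)

    def update(self, angle_ticks):
        """Return estimated ticks-per-loop averaged over the last N loops."""
        if len(self.history) == self.history.maxlen:
            oldest = self.history[0]
            speed = (angle_ticks - oldest) / self.history.maxlen
        else:
            speed = 0.0  # not enough history collected yet
        self.history.append(angle_ticks)
        return speed

est = SpeedOverN(n_loops=4)
readings = [0, 0, 1, 1, 1, 2, 2]  # quantised angle at roughly 0.4 ticks/loop
print([est.update(r) for r in readings])
# -> [0.0, 0.0, 0.0, 0.0, 0.25, 0.5, 0.25]
```

The estimate still alternates between 0.25 and 0.5 ticks/loop for a steady true speed, because the quantisation error (up to 1 tick) is only divided by N rather than removed, which is consistent with seeing the same jumpiness as with the low-pass filter.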
Slowing down the control loop feels wrong in principle, but I tried including delays which could be extended when needed at low speeds, and a slowed control loop also ends up making slow motions jerky.
How am I to interpolate speed when I want to run slower than 1 measurable tick per minimum measurable time period (the time for one loop of the control loop)?
- Anti-hunting for servo position holding. When I stop the servo at a position it is required to hold, it ends up constantly jiggling back and forth with a vibration. This isn't a big motion, it is barely visible, but it can be very clearly heard and seen. The motor tries every so often to give an extra burst of power, then relaxes. I've tried playing with PI loops during position holding.
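For reference, a bare-bones PI position hold of the kind being described might look like the sketch below (gains, names, and tick values are all illustrative assumptions). With a 1-tick quantised encoder, the integral term keeps accumulating while the measured error sits at a whole tick, so the commanded current ramps up until the shaft lurches, then the process repeats, which is one plausible mechanism for the burst-then-relax hunting:

```python
# Illustrative PI position-hold loop; gains and values are assumptions.

def pi_hold(target_ticks, measured_ticks, integral, kp=0.5, ki=0.1):
    """One loop of a PI position hold; returns (current_command, new_integral)."""
    error = target_ticks - measured_ticks
    integral += error
    return kp * error + ki * integral, integral

integral = 0.0
# Shaft parked just under the target, so the quantised reading sits 1 tick off:
for _ in range(5):
    current, integral = pi_hold(target_ticks=100, measured_ticks=99, integral=integral)
print(current, integral)  # current command keeps ramping while the error stays at 1 tick
```

This is only a sketch of the symptom, not a diagnosis; the actual gains and loop structure in the real system will differ.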
I understand this is a tricky problem overall; it is probably unique to my particular system, although something similar likely occurs in other servo systems. No individual part of the system (the motor alone, the encoder alone, the source code alone...) can demonstrate this issue; it only occurs when the whole thing is (not-quite) working together. But I'd appreciate any advice on how to interpolate the speed measurements so I can make really slow moves smoothly, and I suspect the same technique should let me damp out the hunting of the servo during position holding.
Thanks