I have a 12 V DC gear motor driven by an L298N driver. I noticed that when I ramp the PWM value at the analog output of my Arduino linearly up to the value corresponding to 12 V (1023), the actual voltage applied by the L298N is nonlinear. Accordingly, the shaft speed does not change linearly either, although my load torque stays constant.
The attached screenshot shows the ramp input signal (orange, in volts) and the shaft speed (blue, in rad/s). I only measured the L298N's output voltage with a multimeter, so I don't have it plotted, but its behaviour looked a lot like the shaft speed curve.
Am I wrong to assume linear behaviour? I know this obviously doesn't hold when I apply different torques to the motor shaft, but in my measurement the applied torque stayed the same. I could also accept nonlinear behaviour at the very top end, around 11-12 V, but here it is nonlinear over the whole range...
Can somebody enlighten me as to why this could be the case?
PS: In the graph I removed the offset at the beginning, so obviously the motor only starts spinning above a certain voltage > 0 V.