Hi, so I'm driving a stepper motor with ascending delays in microseconds, but this obviously gives a nonlinear revolutions-per-second output.

This means my useful range is quite small, and the speed rises very quickly toward the smaller delays.

I made a table in Excel of delays from 50 µs up to 1400 µs and plotted the resulting speed curve.

I want the user's speed input to feel linear, so I want to map the values 0-100 onto a set of delays that produces a linear speed change.

I know this is more of a maths issue than a code problem, but any help would be appreciated.

Currently my calculation is RPS = 1 / ((delay + 50 µs) × ppr / 1,000,000), where ppr (pulses per revolution) is 2000,

so, with the fixed 50 µs folded into the delay, this simplifies down to

RPS = 1 / (delay × 0.002), with delay in µs
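For what it's worth, one way to sketch the inversion: pick the target speed linearly between the speeds at the two ends of the delay range, then solve the RPS formula above for the delay. This is a minimal Python sketch, assuming ppr = 2000 and the 50-1400 µs range from the table; the function names are just placeholders.

```python
PPR = 2000  # pulses per revolution, from the post

def rps_from_delay(delay_us: float) -> float:
    """Speed in rev/s for a given step delay in µs: RPS = 1e6 / (delay * PPR)."""
    return 1_000_000 / (delay_us * PPR)

# Speeds at the endpoints of the usable delay range (50 µs .. 1400 µs)
RPS_MAX = rps_from_delay(50)    # fastest: shortest delay
RPS_MIN = rps_from_delay(1400)  # slowest: longest delay

def delay_for_input(x: float) -> float:
    """Map user input 0..100 to a delay (µs) so speed ramps linearly."""
    # Interpolate the *speed* linearly...
    target_rps = RPS_MIN + (x / 100) * (RPS_MAX - RPS_MIN)
    # ...then invert RPS = 1e6 / (delay * PPR)  ->  delay = 1e6 / (PPR * RPS)
    return 1_000_000 / (PPR * target_rps)
```

Input 0 then gives the 1400 µs delay, input 100 gives 50 µs, and input 50 lands at whatever delay produces exactly the midpoint speed (around 97 µs, not 725 µs), which is what makes the ramp feel linear.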