delayMicroseconds() not working accurately?

The problem I'm having is that when I use delayMicroseconds() I get all sorts of finicky RPM on the output of this thing. When I use delay() (dividing by 1000) it works, but it doesn't transition smoothly,
because I need the resolution of microseconds, not milliseconds. For example, 9000 RPM has a delay between sparks of 0.01333 seconds, or 13.33 ms [1/(9000/60)*2]. delay() doesn't accept decimals, so only 13 would be recognized.

Is there something I'm overlooking?

int engspeedV = analogRead(A5);
engspeedV = map(engspeedV, 0, 1023, 1000, 9000); // sim engine speed with a variable voltage input
double period = spark(engspeedV);                // converts engine speed to the period between sparks (ms)

analogWrite(sparkPin, 1023); // coilpack activates and discharges
delay(5);                    // discharge period of 5 ms
analogWrite(sparkPin, 0);    // end discharge period
double finalDelay = (period - 5) * 1000; // the wait (us) before the next discharge cycle fires

delayMicroseconds(finalDelay); // try to apply the delay here

We need to see the whole sketch to see what might be affecting things.

delayMicroseconds(finalDelay); — try 'unsigned int' for finalDelay.
Per the reference: "the largest value that will produce an accurate delay is 16383".

BTW, delayMicroseconds() has a granularity of 4 µs.

int sparkPin = 7;

void setup() {
  // put your setup code here, to run once:
  pinMode(sparkPin, OUTPUT);
  pinMode(13, INPUT);
  pinMode(A5, INPUT);
  Serial.begin(57600);
}

void loop() {
  int engspeedV = analogRead(A5);
  engspeedV = map(engspeedV, 0, 1023, 1000, 9000);
  double period = spark(engspeedV);

  analogWrite(sparkPin, 1023);
  delay(5);
  analogWrite(sparkPin, 0);
  double finalDelay = round((period - 5) * 1000);
  //delayMicroseconds(finalDelay); //counting up rpm

  Serial.println(finalDelay);
}

double spark(double rpm)
{
  double hz = (1.0 / 60) * rpm / 2;
  double slice_duration = 1.0 / hz;
  return slice_duration * 1000;
}

delayMicroseconds() does not take a double as an argument but an unsigned int.

Yeah, but it should convert automatically for you...

You may be running into the fact that floating-point calculations (and also analogRead) take quite a long time compared to microseconds: 150+ cycles for simple operations, ~500 cycles for a divide, and up to 3000 cycles for some of the math functions...
https://www.nongnu.org/avr-libc/user-manual/benchmarks.html

HOLY SMOKES

I changed analogWrite to digitalWrite and now it works PERFECTLY!

You code and you learn. Thanks guys!

Lesson learned: always use digitalWrite() when possible.