Forgive my noobness, but I have a question about the basics of how the Arduino works. I have a method that runs a for loop with a digitalWrite and a delayMicroseconds. What I'm seeing is that as the loop bound grows, the execution time drifts slightly away from linear scaling. Maybe this is simply how it works, but I was expecting it to be linear.
Here is the sample code:
// Just for testing Arduino timing drift
// This is what calls the method and does the calculation:
unsigned long msecStart = millis(); // millis() returns unsigned long
rotate(steps, speed);
unsigned long msecFin = millis();
float delta = (float)(msecFin - msecStart);
Serial.println(delta / 1000); // prints seconds, two decimal places by default
void rotate(long steps, float speed){
  float usDelay = (1 / speed) * 70; // It's 140 for this example, since speed = 0.5
  for(long i = 0; i < steps; i++){ // long, not int: a 16-bit int can't count to 60000 on an AVR
    digitalWrite(3, HIGH); // There isn't anything connected to pin 3.
    delayMicroseconds((unsigned int)usDelay); // delayMicroseconds() takes an unsigned int
    digitalWrite(3, LOW);
    delayMicroseconds((unsigned int)usDelay);
  }
}
I set the step value (boundary) to 10000, 20000, 40000, and 60000. Here are the results from running the method above:
10000 steps takes 2.95 secs to execute
20000 steps takes 5.89 secs to execute (0.01 s short of 2 x 2.95 s)
40000 steps takes 11.78 secs to execute (0.02 s short of 4 x 2.95 s)
60000 steps takes 17.66 secs to execute (0.04 s short of 6 x 2.95 s)
What I was expecting is that 20000 steps should take 2.95 x 2 = 5.90 seconds. Maybe I'm nitpicking, but I need accuracy. So perhaps I'm calculating elapsed time wrong, perhaps millis() isn't accurate, or perhaps there is just overhead from executing digitalWrite so many times.
Thoughts?