Hello! I am writing a program in which I have to delay a signal by a certain number of microseconds (usually between 100 µs and 1000 µs).
My issue is that I don't know whether this will work: I need to calculate the number of microseconds to delay, and I am afraid that this calculation might take longer than the delay itself, making the whole thing pointless.
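To make the structure clearer, the real code will look roughly like this. This is only a sketch: the pin, the map() formula and the signal output are placeholders standing in for my actual hardware and calculation.

const int SENSOR_PIN = A5;            // placeholder: the input the delay is derived from

void setup() {
  pinMode(SENSOR_PIN, INPUT);
}

void loop() {
  // Placeholder calculation: map the reading to a delay between 100 and 1000 us.
  // The real formula is different; this only shows the structure.
  int reading = analogRead(SENSOR_PIN);
  unsigned int delayUs = map(reading, 0, 1023, 100, 1000);

  delayMicroseconds(delayUs);         // the delay I actually want to produce

  // ... output the delayed signal here ...
}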
To check how long the calculation actually takes, I created a test program:
int COUNTER = 0;   // loop iteration counter
int V = 0;         // raw analog reading
int CALC = 0;      // result of the dummy calculation

void setup() {
  Serial.begin(300);
  pinMode(A5, INPUT);
}

void loop() {
  V = analogRead(A5);

  // Only measure during iterations 1000..1100
  if (COUNTER >= 1000 && COUNTER <= 1100) {
    Serial.print("START,");
    Serial.println(micros());            // timestamp before the calculation
    CALC = V * 360 * 56 + 3 * 500 - 4;   // dummy calculation standing in for the real one
    Serial.print("STOP,");
    Serial.println(micros());            // timestamp after the calculation
  }

  delay(10);
  COUNTER = COUNTER + 1;
}
This code simply simulates the real code I'm going to run; I wrote it this way because the Arduino is not yet connected, so I cannot run the real code.
The results I got were the following:
START,10121356
STOP,10121716
START,10132196
STOP,10132560
START,10143044
STOP,10143400
This tells me that executing this code takes about 360 µs on average.
Is this correct? If it is, then the delay would be useless in cases where the calculated delay is below 400 µs, and it would be lengthened considerably in cases where the time is above 400 µs, since the calculation alone would almost double the delay time.
Am I doing something wrong?
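For completeness, here is a variant of the test I am planning to try next (I have not run it yet). In my current test the interval between the two micros() calls also contains the Serial.print("STOP,") call and the digit conversion of the first println, so part of the 360 µs might be Serial overhead rather than the calculation itself. This version takes both timestamps before printing anything:

volatile int CALC = 0;   // volatile so the compiler cannot optimize the calculation away

void setup() {
  Serial.begin(300);
  pinMode(A5, INPUT);
}

void loop() {
  int v = analogRead(A5);

  unsigned long t0 = micros();
  CALC = v * 360 * 56 + 3 * 500 - 4;   // same dummy calculation as in my test
  unsigned long t1 = micros();

  // Print only after both timestamps are taken, so Serial is not
  // part of the measured interval.
  Serial.print("CALC,");
  Serial.println(t1 - t0);

  delay(10);
}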