Unwanted delay in my code

In my code I have an unwanted delay that I can't figure out. I want the output to stay HIGH for 80 ms and then go LOW for 2 ms, but instead it goes LOW for 3.6 ms. Can someone figure out why?
Here is the part of the code where I put a HIGH after the 2 ms LOW so that I could measure how much delay I had:

for (int s = 0; s < 8; s++) {
  digitalWrite(output, HIGH);
  delayMicroseconds(10000);
}
digitalWrite(output, LOW);
delayMicroseconds(20000);
digitalWrite(output, HIGH);


For context, here is the whole code, though I don't think it's needed:

const int sensore = 0;
const int LED = 13;
const int output = 12;
const int terra = A5;

int serialCode[9] = { 1,1,0,1,0,1,0,1,0 };
unsigned long object;
bool objectDetected = false;
void setup() {
  pinMode(sensore, INPUT);
  pinMode(output, OUTPUT);
  pinMode(LED, OUTPUT);
  digitalWrite(output, LOW);
  analogWrite(terra, 180);
}

void loop() {
  // LED high if the sensor sees nothing
  if (digitalRead(sensore) == LOW) {
    digitalWrite(LED, HIGH);
  }
  if (digitalRead(sensore) == HIGH) {
    digitalWrite(LED, LOW);  // LED low if the sensor sees an object
    object = millis();
    while (object + 71 > millis()) {  // check the signal stays HIGH for 70 ms
      if (digitalRead(sensore) == LOW) {
        break;
      }
      if (object + 70 == millis()) {
        objectDetected = true;
      }
    }
    if (objectDetected) {
      //Carrier Wave;
      for (int s = 0; s < 8; s++) {
        digitalWrite(output, HIGH);
        delayMicroseconds(10000);
      }
      digitalWrite(output, LOW);
      delayMicroseconds(20000);
      digitalWrite(output, HIGH);
      //Manchester encoding
      for (int i = 0; i < 9; i++) {
        if (serialCode[i] == 0) {
          digitalWrite(output, HIGH);
          delayMicroseconds(1000);
          digitalWrite(output, LOW);
          delayMicroseconds(1000);
        } else {
          digitalWrite(output, LOW);
          delayMicroseconds(1000);
          digitalWrite(output, HIGH);
          delayMicroseconds(1000);
        }
        if (i == 8) {
          digitalWrite(output, LOW);
          delay(500);
          objectDetected = false;
        }
      }
    }
  }
}

Yes, my mistake about the 20 ms. Where does the millis() interrupt come in, though?

All of the system timing is done with this interrupt. It runs independently of the user code, but it takes some program time each time it fires. It can be turned off, but then all of the timing functions stop. Slow down your scope's horizontal timebase and time the delays; they should line up with the interrupts. A simple explanation of an interrupt: when it is triggered, it saves the state of the processor, does its thing, then restores the processor state and resumes where it left off.

According to the delayMicroseconds reference page:

Currently, the largest value that will produce an accurate delay is 16383; larger values can produce an extremely short delay. This could change in future Arduino releases. For delays longer than a few thousand microseconds, you should use delay() instead.

20000 − 16383 = 3617, corresponding to a delay of ~3.6 ms.


Which Arduino?

I think that is it thank you very much

which Arduino?

I'm programming an ATmega328P with an external 16 MHz crystal, using an Arduino Uno.

Why not a PWM pin?

Should I move terra to a PWM pin? It's just a constant signal that acts as ground for a different circuit.

Why break in a while loop?

Because I want the code to run only when the signal is HIGH for more than 70 ms; if it's HIGH for less, I do not want the code to be executed.

Thanks for the subtraction tip
