PIR sensor with "slow-start"

So I'm using an HC-SR501 PIR sensor in conjunction with an IR LED to send an infrared signal when motion is detected.
When I first started this project, detection was immediate, as it should be.
However, at some point that changed: every time I power up the Arduino, it takes a while to start detecting motion. Once it does start, it keeps detecting reliably, as if it needs to warm up first. The setup time isn't consistent either; it seems fairly random so far. I tried 3 different sensors and 3 different Arduinos and the bug persists, so perhaps something in my code or setup is causing this. Here is my code:

#include <IRremote.hpp>
// Define connection pins:
#define pirPin 2
#define ledPin 13
#define IR_LED 3

// Create variables:
int val = 0;
bool motionState = false; // We start with no motion detected.

void setup() {
  // Configure the pins as input or output:
  pinMode(pirPin, INPUT);
  pinMode(ledPin, OUTPUT);
  pinMode(IR_LED, OUTPUT);

  // Begin serial communication at a baud rate of 9600:
  Serial.begin(9600);
  
  IrSender.begin(IR_LED, ENABLE_LED_FEEDBACK, USE_DEFAULT_FEEDBACK_LED_PIN);
  attachInterrupt(digitalPinToInterrupt(pirPin),motion_r,CHANGE);
}

void loop() {
  // For debug purposes
  Serial.println("Setting up IR!"); 

  // If motion is detected (pirPin = HIGH), do the following:
  if (val == HIGH) {
    digitalWrite(ledPin, HIGH); // Turn on the on-board LED.

    Serial.println("Sending IR!"); 

    // Burst 3 IR cmds every 1 sec while motion is activated
    for (int i = 0; i < 3; i++) {
      IrSender.sendNEC(0x04, 0x3, 1);
      Serial.println(i+1);
    }
    delay(1000);
    
    // Change the motion state to true (motion detected):
    if (motionState == false) {
      Serial.println("Motion detected!");
      motionState = true;
    }
  }

  // If no motion is detected (pirPin = LOW), do the following:
  else {
    digitalWrite(ledPin, LOW); // Turn off the on-board LED.

    // Change the motion state to false (no motion):
    if (motionState == true) {
      Serial.println("Motion ended!");
      motionState = false;
    }
  }
  delay(100);
}


void motion_r(){
  val = digitalRead(pirPin);
  Serial.println("Interrupt!");
}

Does anyone have any clue why this happens?
Thanks in advance.
Gabriel

Start by taking the serial print out of the interrupt routine, and declare val as volatile.
See if that helps.
good luck.. ~q

You are breaking all the rules for using interrupts. val should be declared "volatile byte" and you should never do serial I/O in an interrupt.

Thank you both for your replies. That doesn't seem to be the issue, though: I declared val as volatile and removed the serial print (which was only there for debugging anyway). I'm now printing val in the main loop, and as I expected it stays at 0 for a while before it starts being triggered to 1.
I'm yet to find the cause of this delay in detection.
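For reference, a minimal sketch of that modification (reusing pirPin from the code above; the rest of the sketch is unchanged):

```cpp
// Declared volatile so the compiler re-reads it in loop() every time,
// instead of caching a stale copy in a register.
volatile byte val = LOW;

// Keep the ISR short: just capture the pin state. No Serial I/O here,
// since Serial itself relies on interrupts, which are disabled inside an ISR.
void motion_r() {
  val = digitalRead(pirPin);
}
```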

Can you post the new code with this modification you made?

It's a dumb idea to use interrupts for a motion sensor.
Only use interrupts for things that change faster than the loop() time.
A slooooow PIR sensor doesn't fall under that category.
Just poll the PIR.
Leo..
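A polling version along those lines could be as simple as this (a sketch, reusing the pin names from the original code):

```cpp
#define pirPin 2
#define ledPin 13

void setup() {
  pinMode(pirPin, INPUT);
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // The HC-SR501 holds its output HIGH for seconds after a trigger,
  // so reading it once per 100 ms pass is more than fast enough.
  int val = digitalRead(pirPin);
  digitalWrite(ledPin, val);
  delay(100);
}
```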

It does seem to work with polling.
However, I started this project using polling only and switched to interrupts because of power consumption: since this application is expected to sit inactive for long stretches, I used an interrupt so the CPU only wakes up when movement is detected. And I'm pretty sure this problem didn't happen when I first tried it.
I'm not sure why, but it might be some kind of interference between the pre-programmed detection modes of the HC-SR501 and my code.
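For reference, the wake-on-interrupt pattern I mean is along these lines (a rough sketch using the AVR sleep API, not my exact code):

```cpp
#include <avr/sleep.h>

#define pirPin 2

volatile bool motionFlag = false;

void motion_r() {
  motionFlag = true;  // only flag the event; keep the ISR minimal
}

void setup() {
  pinMode(pirPin, INPUT);
  attachInterrupt(digitalPinToInterrupt(pirPin), motion_r, RISING);
}

void loop() {
  if (!motionFlag) {
    // Idle sleep: any interrupt wakes the CPU. (Power-down mode saves
    // more, but on AVR external interrupts only wake it on a LOW level
    // or a pin change, not on RISING.)
    set_sleep_mode(SLEEP_MODE_IDLE);
    sleep_enable();
    sleep_cpu();      // execution resumes here after the PIR interrupt
    sleep_disable();
  } else {
    motionFlag = false;
    // ...handle the motion event (send the IR burst, etc.)...
  }
}
```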

Still thanks for the help.

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.