Photo & Motion sensor issue

The code runs and every component works, but I have a delay issue that I cannot figure out, and I believe it's in the loop portion of the code.

The desired effect: a photo sensor detects light. If there is no light, watch for motion. On motion, fade the LED(s) up. After a specified delay, fade the LED(s) down. Repeat.

What's happening is that the LED(s) stay on for about 90 seconds despite a 1 ms delay (the actual desired delay is 5000 ms). They do eventually fade out as intended and shut off when exposed to light. Everything but the delay works as intended.

I'm expecting that the following code functions as a delay to keep the LED(s) on, once initiated; however, when I set it to "0", I see that the LED(s) are fading up and down over and over without motion. I did something wrong and I'm not seeing it. It's so damn close, but not there yet.

const long interval = 1;

I set it to "1" because everything larger is just more unnecessary delay to seeing a reaction.

The full source code for the project:

#define LEDPIN 3
#define PIRPIN 2
#define LDR 0

int ldrValue = 0;
int ledState = 0;
unsigned long previousMillis = 0;
unsigned long currentMillis = 0;
boolean alreadyfadedup  = false;
boolean alreadyfadeddown  = false;
const long interval = 1;  //amount of time the leds will stay up in millis

void setup() {
  pinMode(PIRPIN, INPUT);
}

void loop() {

  ldrValue = analogRead(LDR);

  currentMillis = millis();

  if (ldrValue <= 512) {
    analogWrite(LEDPIN, 0);
  }
  else {
    if (digitalRead(PIRPIN) == 1) {
      ledState = 1;
      previousMillis = currentMillis;
      if (alreadyfadedup == false) {
        for (int i = 0; i < 256; i++) {
          analogWrite(LEDPIN, i);
          delay(17);
        }
        alreadyfadedup = true;
        alreadyfadeddown = false;
      }
    }
    if (currentMillis - previousMillis >= interval) {
      if (ledState == 1) {
        if (alreadyfadeddown == false) {
          for (int i = 255; i >= 0; i--) {
            analogWrite(LEDPIN, i);
            delay(17);
          }
          alreadyfadeddown = true;
          alreadyfadedup = false;
        }
        ledState = 0;
      }
      else {
        analogWrite(LEDPIN, 0);
      }
    }
  }
}

Your fading code is blocking, so there is little point in trying to make the 1 ms interval non-blocking.

Just make the whole thing blocking:

const unsigned long interval = 1;  //amount of time the leds will stay up in millis

•••

// in loop, upon detection

for (int i = 0; i < 256; i++) {
  analogWrite(LEDPIN, i);
  delay(17);
}

delay(interval);

for (int i = 255; i >= 0; i--) {
  analogWrite(LEDPIN, i);
  delay(17);
}

If you want non-blocking code, use a state machine. Here is a small introduction to the topic: Yet another Finite State Machine introduction

Thank you for the response. That delay is incrementally increasing/decreasing the brightness of the LED(s). That’s keeping the LED(s) active longer than expected?

The problem in your code is that you always check the PIR pin, and whenever it reads HIGH you directly update previousMillis:

    if (digitalRead(PIRPIN) == 1) {
      ledState = 1;
      previousMillis = currentMillis;

When your PIR is triggered, its output stays HIGH for some time (usually there is a potentiometer on the PIR to fine-tune that duration; it can be a few seconds or a few minutes).

So if your PIR is set up for, say, 90 seconds, then during those 90 seconds you keep updating previousMillis, and the next if

    if (currentMillis - previousMillis >= interval) {

never triggers.

Only when the PIR output goes LOW do you stop updating previousMillis, and 1 ms later (because interval is 1 ms) the condition becomes true and the fade-down runs.


But as I wrote in the first answer: because you block for most of the time anyway, the minimal delay between the two fades is just irrelevant... either make the code fully blocking, or go for a state machine where there is no delay() whatsoever.
