Is delay(0) defined?

Hi

I’ve been playing with the example sketch AnalogInput and it doesn’t behave as I would expect:

Turning the pot on the analogue input increases the flashing rate as expected, until it is near the end of its travel, when the LED is extinguished instead of remaining on at half power.

I have checked the analogRead() result by sending it to the serial monitor and find that the LED remains off when the result is 0.

Is there a problem with having a delay of zero in the for loop?

Russell

Is there a problem with having a delay of zero in the for loop?

Other than being useless? Not really.

I have checked the analogRead() result by sending it to the serial monitor and find that the LED remains off when the result is 0.

Well, perhaps it's time to share the code.

Have a look at Several Things at a Time, which illustrates how to use millis() to manage time.
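Roughly, the idea looks like this (just a quick illustration of the technique, not the code from that thread): the pot sets the interval between toggles, and nothing in loop() ever blocks.

const int sensorPin = A0;
const int ledPin = 13;

unsigned long previousMillis = 0;
int ledState = LOW;

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // the pot reading (0-1023) sets the number of milliseconds between toggles
  unsigned long interval = analogRead(sensorPin);
  if (millis() - previousMillis >= interval) {
    previousMillis = millis();
    ledState = (ledState == LOW) ? HIGH : LOW;   // toggle the LED
    digitalWrite(ledPin, ledState);
  }
}

Because nothing blocks, the LED keeps toggling on schedule even if you add other work to loop().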

...R

russellz:
Is there a problem with having a delay of zero in the for loop?

As far as I can remember, only delayMicroseconds() has a problem with a parameter of 0; delay() does not.
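If you want to satisfy yourself that delay(0) returns straight away on your own board, you could time it with micros(). Something like this quick test sketch (just a suggestion, not part of the example) would do:

void setup() {
  Serial.begin(9600);
  unsigned long start = micros();
  delay(0);                            // should return almost immediately
  unsigned long elapsed = micros() - start;
  Serial.print("delay(0) took about ");
  Serial.print(elapsed);
  Serial.println(" us");
}

void loop() {
}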

So please show your complete code, then perhaps somebody on this forum can find out what's going on with your sketch.

It’s just the analogue read example file. I have added a couple of lines to send “sensorValue” to the serial monitor to try to see what’s going on:

/*
  AnalogInput example
  http://arduino.cc/en/Tutorial/AnalogInput
*/

int sensorPin = A0;    // select the input pin for the potentiometer
int ledPin = 13;      // select the pin for the LED
int sensorValue = 0;  // variable to store the value coming from the sensor

void setup() {
  // declare the ledPin as an OUTPUT:
  pinMode(ledPin, OUTPUT);
  // initialize serial communications at 9600 bps:
  Serial.begin(9600); 
}

void loop() {
  // read the value from the sensor:
  sensorValue = analogRead(sensorPin);
  // print the results to the serial monitor:
  
  Serial.print("\nsensor = " );                       
  Serial.print(sensorValue);      
  // turn the ledPin on
  digitalWrite(ledPin, HIGH);  
  // stop the program for <sensorValue> milliseconds:
  delay(sensorValue);          
  // turn the ledPin off:        
  digitalWrite(ledPin, LOW);   
  // stop the program for <sensorValue> milliseconds:
  delay(sensorValue);                  
}

My question is why does the LED go out, instead of being on at reduced power, when “sensorValue” reaches zero?

I’m just a little confused by that.

Russell

Are you asking why, if you turn the LED on for zero milliseconds, it goes out?

Nick Gammon:
Are you asking why, if you turn the LED on for zero milliseconds, it goes out?

No. Err, I see what you mean, but zero delay between turning it on and turning it off isn't zero time; it is a few clock cycles. The LED is then turned off, with zero delay before the cycle repeats.

Ah, I think I've got it: there is also an analogRead() in the loop, and I guess that introduces a much more significant delay than the delay(0) instruction, so the duty cycle of the LED becomes very small and it appears to go out.
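To check that, I suppose loop() in the posted sketch could be replaced with a version that times things with micros() (just a test idea, reusing the same globals and setup() from the sketch above):

void loop() {
  unsigned long t0 = micros();
  sensorValue = analogRead(sensorPin);   // how long does the read take?
  unsigned long t1 = micros();

  digitalWrite(ledPin, HIGH);
  delay(sensorValue);                    // LED on period
  unsigned long t2 = micros();

  digitalWrite(ledPin, LOW);
  delay(sensorValue);                    // LED off period (plus the read and the printing)

  Serial.print("analogRead: ");
  Serial.print(t1 - t0);
  Serial.print(" us, LED on: ");
  Serial.print(t2 - t1);
  Serial.println(" us");
}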

Thanks

Russell.

Yes, analogRead() normally takes around 104 µs, so the LED will be off most of the time.
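If you want the LED to sit at roughly half brightness when the reading is 0, one option (just a suggestion; the Serial prints are left out here because at 9600 baud they dominate the off time) is to keep the delay at a minimum of 1 ms so the on and off periods stay equal:

void loop() {
  sensorValue = analogRead(sensorPin);
  int onOffTime = max(sensorValue, 1);   // never let the delay drop to 0

  digitalWrite(ledPin, HIGH);
  delay(onOffTime);                      // LED on
  digitalWrite(ledPin, LOW);
  delay(onOffTime);                      // LED off
}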