Arduino Uno Ignoring Delay Function Within For Loop? (I think)

Hello, I am a beginner Arduino user. I recently picked up the hobby and this is my first forum post. I have a beginner kit by DFROBOT called "Beginner Kit for Arduino," which includes an Arduino Uno. I am currently on the 4th project of the kit, which fades an LED on and off (picture of the circuit attached), and I found something interesting. In the following code, it takes one second (1000 ms) to fade the LED on and another second to fade it off, and it keeps looping, which is expected:

int ledPin = 10;

void setup() { 
    pinMode(ledPin,OUTPUT);
}

void loop(){ 
    fadeOn(1000,5);
    fadeOff(1000,5);
}

void fadeOn(unsigned int time,int increment){
    for (unsigned char value = 0 ; value < 255; value+=increment){
      analogWrite(ledPin, value); 
      delay(time/(255/increment));
      }
}
void fadeOff(unsigned int time,int decrement){
    for (unsigned char value = 255; value >0; value-=decrement){
      analogWrite(ledPin, value);
      delay(time/(255/decrement));
      }
}
//You will see the LED getting brighter and fading constantly after uploading the code.

Me being me, I decided to mess around with the increment/decrement values, which is when I noticed something strange about the speed of the fade-in and fade-out. Looking at the code, no matter what the increment/decrement value is, each fade-in and fade-out should take one second, right?

Well, I worked out the factors of 255 to use as increment/decrement values and conducted a short experiment: I tried each factor as the increment/decrement value and timed how long the Arduino took to complete 10 cycles of fading up to full brightness and back off. Going strictly by the code, 10 on/off cycles should take 20 seconds no matter what the increment/decrement value is, because the fade to bright is given 1000 ms and the fade to off is given 1000 ms as well; repeating a 2-second on/off cycle ten times should give a 20-second total.

After a couple of trials, I found that when I use factors of 255 greater than or equal to 5 as the increment/decrement value, it takes roughly the expected 20 seconds to fade on and off 10 times. However, when I used 1 as the increment/decrement value, it took approximately 15 seconds to finish the 10 fade cycles, and that makes no sense! The following code takes 15 seconds for 10 fade cycles instead of the expected 20. If you look at the code and do the math, it should still take 20 seconds for 10 fade cycles:

int ledPin = 10;

void setup() { 
    pinMode(ledPin,OUTPUT);
}

void loop(){ 
    fadeOn(1000,1);
    fadeOff(1000,1);
}

void fadeOn(unsigned int time,int increment){
    for (unsigned char value = 0 ; value < 255; value+=increment){
      analogWrite(ledPin, value); 
      delay(time/(255/increment));
      }
}
void fadeOff(unsigned int time,int decrement){
    for (unsigned char value = 255; value >0; value-=decrement){
      analogWrite(ledPin, value);
      delay(time/(255/decrement));
      }
}
//You will see the LED getting brighter and fading constantly after uploading the code.

To test this further, I made a separate sketch and used digital writes instead of analog writes to see if analogWrite was causing the problem. In the following code, it once again takes 15 seconds for what should have taken 20 seconds, so it wasn't a problem with analogWrite, but some sort of issue with the delay() calls inside the for loop:

int ledPin = 10;

void setup() { 
    pinMode(ledPin,OUTPUT);
}

void loop(){ 
    digitalWrite(ledPin,1);
    fadeOff(1000,1); // This is supposed to be just a 1 second delay; I commented out the analogWrite portion in the function definition.
    digitalWrite(ledPin,0);
    fadeOn(1000,1); // This is supposed to be just a 1 second delay; I commented out the analogWrite portion in the function definition.
}

void fadeOn(unsigned int time,int increament){
    for (unsigned char value = 0 ; value < 255; value+=increament){
      //analogWrite(ledPin, value); 
      delay(time/(255/increament));
      }
}
void fadeOff(unsigned int time,int decreament){
    for (unsigned char value = 255; value >0; value-=decreament){
      //analogWrite(ledPin, value);
      delay(time/(255/decreament));
      }
}
//Even with digitalwrite, it still takes 15 seconds to complete 10 on and off cycles.
//Theoretically, should've taken 20 seconds, not 15.

Thanks for bearing with me! Could anyone please tell me what's going on? What is the problem? The math checked out, so why do the on/off fade cycles with an increment/decrement value of 1 take 15 seconds instead of 20? I hypothesize that the Arduino simply can't keep up with 255 loops of code in just one second, especially with a delay() in each of them. Yet, as I mentioned earlier, I'm just a beginner, so I can't draw any educated conclusions. It would be great if someone with more experience than me could figure this out, because frankly, I'm quite curious! :)

(Note FYI: I use the create.arduino.cc online IDE to code my Arduino Uno)

//Even with digitalwrite, it still takes 15 seconds to complete 10 on and off cycles.

How and where are you timing the 10 on/off cycles?

Does

    delay(time / (255 / decreament));

evaluate to the value that you expect, bearing in mind that it uses integers for its calculations?
Have you tried printing what it evaluates to? Is it what you expect?

UKHeliBob:

//Even with digitalwrite, it still takes 15 seconds to complete 10 on and off cycles.

How and where are you timing the 10 on/off cycles?

Does

    delay(time / (255 / decreament));

evaluate to the value that you expect, bearing in mind that it uses integers for its calculations?
Have you tried printing what it evaluates to? Is it what you expect?

I timed my on/off cycles with the stopwatch on my iPhone. I started the stopwatch at the start of the program, when the LED first turned on, and stopped it after the 10th on/off cycle. I guess I should've used the millis() function.

Also, I have not tried printing the delay, I don't know how to troubleshoot it that way. But I did the math: if the increment is 1, then the for loop should run 255 times, and in each loop, it should do a delay of 1000/(255/1) ms = 1000/255 ms; (1000/255) * 255 = 1000 ms per fade. If each on/off cycle takes 2 seconds (one second on, one second off), it should take at least 20 seconds for 10 cycles.

I'm pretty new to Arduino and I don't know any good methods to test this properly. Could you please describe what code to write?

-Thanks in advance, it would be a big help!

Also, I have not tried printing the delay, I don't know how to troubleshoot it that way,

Put

Serial.begin(115200);

in setup() then

      Serial.println(time/(255/increament));

immediately before the delay() and set the Serial monitor baud rate to 115200 when you open it

UKHeliBob:
Put

Serial.begin(115200);

in setup() then

      Serial.println(time/(255/increament));

immediately before the delay() and set the Serial monitor baud rate to 115200 when you open it

I made those changes to my code:

int ledPin = 10;

void setup() { 
    Serial.begin(115200);
    pinMode(ledPin,OUTPUT);
}

void loop(){ 
    fadeOn(1000,1);
    fadeOff(1000,1);
}

void fadeOn(unsigned int time,int increment){
    for (unsigned char value = 0 ; value < 255; value+=increment){
      analogWrite(ledPin, value); 
      Serial.println(time/(255/increment));
      delay(time/(255/increment));
      }
}
void fadeOff(unsigned int time,int decrement){
    for (unsigned char value = 255; value >0; value-=decrement){
      analogWrite(ledPin, value);
      Serial.println(time/(255/decrement));
      delay(time/(255/decrement));
      }
}
//You will see the LED getting brighter and fading constantly after uploading the code.

I ended up with a serial monitor that looks like this:
3
3
3
3
3
… (3 just rapidly repeats indefinitely)

Using the Serial Monitor, I also conducted another experiment: I used the millis() function to print how many milliseconds it took to finish 10 cycles from the time the program starts.

With the increment/decrement value of 1 I wrote the following code:

int ledPin = 10;

void setup() { 
    Serial.begin(115200);
    pinMode(ledPin,OUTPUT);
}

void loop(){ 
    for (int cycle = 0; cycle < 10; cycle++){
      digitalWrite(ledPin,1);
      fadeOff(1000,1); // Just a 1 second delay; the analogWrite in the function definition is commented out.
      digitalWrite(ledPin,0);
      fadeOn(1000,1);  // Just a 1 second delay; the analogWrite in the function definition is commented out.
    }
    Serial.println(millis());
    // The loop above is 10 ons and 10 offs, with 1 second of delay after each on and off.
    // The total should be 20 seconds, but according to the millis() value printed at the end,
    // it all gets done in about 15 seconds if you use 1 as the increment.
}

void fadeOn(unsigned int time,int increament){
    for (unsigned char value = 0 ; value < 255; value+=increament){
      //analogWrite(ledPin, value); 
      delay(time/(255/increament));
      }
}
void fadeOff(unsigned int time,int decreament){
    for (unsigned char value = 255; value >0; value-=decreament){
      //analogWrite(ledPin, value);
      delay(time/(255/decreament));
      }
}

On the serial monitor, it displayed 15340 as the number of milliseconds it took to complete 10 on/off cycles with an increment/decrement of 1. With an increment/decrement of 3 it took 18713 ms. With an increment/decrement of 5 it took 19387 ms. With 15 it took 19722 ms, with 17 it took 19802 ms, with 51 it took 20000 ms, and with 85 it took 19980 ms. With an increment/decrement of 255 the for loop runs only once per fade, and it took 19999 ms to complete 10 on/off cycles. I still can't understand why all increment values don't cause it to take 20 seconds. Most increment values actually make it take less than 20 seconds.

old programmer wisdom:

the bug has its hands on the keyboard and is staring at the screen

A microcontroller always does what you have programmed. If the program does something different than you expected, you don't understand your own program (yet).

You just have to do the calculation on a pocket calculator or smartphone and bear in mind that you are using integer variables, which means the digits behind the decimal point get truncated.

Example, with an increment of 15:

1000/(255/15) = 1000/17 = 58.823529412

but as delay() uses integers,

delay(time/(255/increament));

does delay(58); // not delay(58.823529412)

Your for loop has the ending condition value < 255.
This means:

1: value 0
2: value 15
....
17: value 240
18: value 255, so value < 255 is already false

Waiting 17 * 58 milliseconds adds up to 986 instead of 1000.
The difference is the truncated digits behind the decimal point.
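
If you want to see the truncation for yourself, here is a quick test sketch along these lines (not from your kit; it only prints to the Serial monitor, and the list of increments is just the set you tested). It prints the truncated delay, the exact value, the number of loop passes, and the total time one fade really takes:

void setup() {
    Serial.begin(115200);
    // Increment values from your experiment; change them as you like.
    int increments[] = {1, 3, 5, 15, 17, 51, 85, 255};
    for (int i = 0; i < 8; i++) {
        int inc = increments[i];
        int stepDelay = 1000 / (255 / inc);     // integer math, same as delay(time/(255/increment))
        float exact = 1000.0 / (255.0 / inc);   // the value without truncation
        int steps = 0;
        for (int value = 0; value < 255; value += inc) {
            steps++;                            // how many times the fade loop really runs
        }
        Serial.print("increment ");
        Serial.print(inc);
        Serial.print(": delay() gets ");
        Serial.print(stepDelay);
        Serial.print(" ms (exact ");
        Serial.print(exact);
        Serial.print(" ms), ");
        Serial.print(steps);
        Serial.print(" steps, ");
        Serial.print(steps * stepDelay);
        Serial.println(" ms per fade");
    }
}

void loop() {
}

For an increment of 1 that comes out to 255 steps of 3 ms = 765 ms per fade, and for 15 it is 17 steps of 58 ms = 986 ms, which is why your cycles finish early.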

best regards Stefan

then the for loop should run 255 times, and in each loop, it should do a delay of 1000/(255/1) ms.

Yes, as you have seen, that delay becomes 3 milliseconds due to the truncation with the integer math.

On the serial monitor, it displayed 15340 as the number of milliseconds it took to complete 10 on/off cycles with an increment/decrement of 1.

Yes. You see 3 * 255 = 765 ms, * 2 for up and down = 1530 ms per cycle, so 15300 ms for the ten cycles. The 15340 you measured looks pretty close.

When you change the increment from 1, you don't have 255 delay periods and the integer truncation of the delay value is different.

With an increment/decrement of 5 it took 19387 ms.

You have an increment/decrement of 5. Then 1000 / (255/5) is a delay period of 19 ms using the integer value. To go from 255 to 0 by 5's gives you 51 intervals, so 51 * 19 * 2 = 1938 ms per cycle and 19380 ms for ten cycles. Again very close to the value you see.

I still can't understand why all increment values don't cause it to take 20 seconds. Most increment values actually make it take less than 20 seconds.

This is due to the truncation of the integer division in calculating the delay period, and also to the number of steps being truncated if your increment is not a factor of 255.

I take it that you did not notice

bearing in mind that it uses integers for its calculations

in reply #1

StefanL38:
old programmer wisdom:

the bug has its hands on the keyboard and is staring at the screen

A microcontroller always does what you have programmed. If the program does something different than you expected, you don't understand your own program (yet).

You just have to do the calculation on a pocket calculator or smartphone and bear in mind that you are using integer variables, which means the digits behind the decimal point get truncated.

Example, with an increment of 15:

1000/(255/15) = 1000/17 = 58.823529412

but as delay() uses integers,

delay(time/(255/increament));

does delay(58); // not delay(58.823529412)

Your for loop has the ending condition value < 255.
This means:

1: value 0
2: value 15
....
17: value 240
18: value 255, so value < 255 is already false

Waiting 17 * 58 milliseconds adds up to 986 instead of 1000.
The difference is the truncated digits behind the decimal point.

best regards Stefan

Ohhh, I see, thanks! I totally ignored the fact that integer math truncates the decimals. No wonder it didn't add up to a full second when run.

cattledog:
Yes, as you have seen, that delay becomes 3 milliseconds due to the truncation with the integer math.

Yes. You see 3 * 255 = 765 ms, * 2 for up and down = 1530 ms per cycle, so 15300 ms for the ten cycles. The 15340 you measured looks pretty close.

When you change the increment from 1, you don’t have 255 delay periods and the integer truncation of the delay value is different.

You have an increment/decrement of 5. Then 1000 / (255/5) is a delay period of 19 ms using the integer value. To go from 255 to 0 by 5's gives you 51 intervals, so 51 * 19 * 2 = 1938 ms per cycle and 19380 ms for ten cycles. Again very close to the value you see.

This is due to the truncation of the integer division in calculating the delay period, and also to the number of steps being truncated if your increment is not a factor of 255.

Yup! I see why now. Even though the math would always bring it to a full 2 seconds, the truncation of the decimal changes things. Thanks!

UKHeliBob:
I take it that you did not notice "bearing in mind that it uses integers for its calculations" in reply #1

Yeah, I guess I overlooked that. I didn't know what using integers for the calculations meant for the code. I understand now that the result gets truncated, so delay time is actually being sliced off. Thanks for the help!

Thank you everyone for the help! I totally understand what’s going on now.

Now I'm wondering: how would I prevent the truncation, and how can I make 10 on/off cycles always take exactly 20 seconds, no more, no less?

Using the function delayMicroseconds() will offer more accuracy, but still not 100%, because it would be delayMicroseconds(58823) instead of delay(58).

Even more accuracy requires calculating time differences with the millis() function, as described in the tutorials about non-blocking timing using millis().
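
A rough sketch of that idea (pin 10 as in your kit project; the fadeExact() name and the FADE_TIME constant are just made up for this example) derives the brightness from the elapsed millis() instead of delaying a fixed, truncated amount per step, so each fade lasts the full 1000 ms no matter how fine the steps are:

int ledPin = 10;
const unsigned long FADE_TIME = 1000;   // duration of one fade in ms

void setup() {
    pinMode(ledPin, OUTPUT);
}

// Brightness follows elapsed time, so the fade ends after FADE_TIME ms
// (to within the resolution of millis()) instead of after a truncated sum of delays.
void fadeExact(bool up) {
    unsigned long start = millis();
    unsigned long elapsed;
    do {
        elapsed = millis() - start;
        if (elapsed > FADE_TIME) {
            elapsed = FADE_TIME;                              // clamp at the end of the fade
        }
        unsigned long level = (elapsed * 255UL) / FADE_TIME;  // 0..255, proportional to time
        analogWrite(ledPin, up ? level : 255 - level);
    } while (elapsed < FADE_TIME);
}

void loop() {
    fadeExact(true);    // 1000 ms up
    fadeExact(false);   // 1000 ms down, so one on/off cycle is 2000 ms and 10 cycles are 20 s
}

This version still blocks inside fadeExact() while the fade runs; the fully non-blocking style from the millis() tutorials additionally keeps loop() free to do other things at the same time.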

best regards Stefan

got it, thanks!
