
Topic: delay using empty FOR loop?? (Read 2009 times)


I am not able to create a delay using an empty FOR loop on the Duemilanove. A while loop works fine. I verified this with a scope: changing the loop-count constants from 1 to 1000000 showed no significant change in timing.
NOTE: If I added a simple "k = i;" to each loop, still no success. If I added digitalRead(), then it does work. Has anyone had similar problems?

/*
 * Blink using while loop for delay
 * The basic Arduino example.  Turns on an LED for one second,
 * then off for one second, and so on...  We use pin 13 because,
 * depending on your Arduino board, it has either a built-in LED
 * or a built-in resistor so that you need only an LED.
 */

#define HMILS  1
#define HMICS  1
#define LMILS  1000
#define LMICS  1000

int ledPin = 13;                // LED connected to digital pin 13

void wait(long mill, long mics);  // forward declaration

void setup()                    // runs once, when the sketch starts
{
  pinMode(ledPin, OUTPUT);      // sets the digital pin as output
}

void loop()                     // runs over and over again
{
  for (;;)
  {
    digitalWrite(ledPin, HIGH);   // sets the LED on
    //delayMicroseconds(1);       // this version waits as expected
    wait(HMILS, HMICS);
    digitalWrite(ledPin, LOW);    // sets the LED off
    //delayMicroseconds(1000);    // this version waits as expected
    wait(LMILS, LMICS);
  }
}

// Empty nested busy-wait: produces no visible delay on the scope.
void wait(long mill, long mics)
{
  long i, j; //,k;
  i = 0;
  while (i < mill)    //for (i=0;i<mill;++i)
  {
    //k = i; //digitalRead(ledPin);
    j = 0;
    while (j < mics)  //for (j=0;j<mics;++j)
    {
      //k = j; //digitalRead(ledPin);
      ++j;
    }
    ++i;
  }
}



I'm just guessing, but it is possible that the compiler analyzes your code, recognizes that nothing happens inside your for loop, and optimizes the whole loop away.

I've never looked at the code output by the AVR compiler so I have no idea how much optimization is implemented by the compiler.
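If optimization is the culprit, one common way to see it is to make the loop's work observable to the compiler. A minimal sketch (busyWait is a made-up name, not from the original post; this assumes avr-gcc's usual -Os setting): marking the work variable volatile forces the compiler to keep every iteration, because volatile accesses count as observable side effects.

```cpp
// Busy-wait whose work variable is volatile: the volatile writes are
// observable side effects, so the optimizer cannot delete the loops.
long busyWait(long outer, long inner) {
    volatile long work = 0;            // volatile: every increment must happen
    for (long i = 0; i < outer; ++i) {
        for (long j = 0; j < inner; ++j) {
            ++work;                    // read-modify-write of a volatile
        }
    }
    return work;                       // equals outer * inner
}
```

Unlike the empty loop, calling busyWait(1000, 1000) really performs a million volatile increments even at -Os, so the delay scales with the constants.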


Quite a bit.  And trying to trick the compiler to avoid optimizations can be inconsistent, frustrating and futile.

Two choices:  call delayMicroseconds() (recommended), or implement your own delay with inline assembler instructions.
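For the inline-assembler route, a hedged sketch (nopDelay is a hypothetical helper, not part of the Arduino core): a statement written as asm volatile is never removed by the optimizer, so a loop around a nop survives even when its empty C counterpart would not. On AVR each nop takes one cycle, though the loop bookkeeping adds a few more cycles per iteration.

```cpp
// Burn roughly n iterations using an inline "nop" instruction.
// __asm__ __volatile__ tells the compiler the statement has effects
// it cannot see, so the loop is kept even at -Os.
unsigned long nopDelay(unsigned long n) {
    unsigned long done = 0;
    while (n--) {
        __asm__ __volatile__("nop");   // one do-nothing instruction
        ++done;                        // count iterations actually run
    }
    return done;
}
```

The return value is only there to show the loop really executed; a pure delay version would return void.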


Optimization, hmmm... Seems to be a pretty smart compiler. I hope it doesn't take too much liberty in second-guessing me. Being a newbie, I tried this before I realized there was a delayMicroseconds(), which is quite sufficient for my needs. I was just worried that I might have uncovered a bug.
Thanks for your insight.
