I need help with timing between digitalWrite() calls without using delay(). I have searched and read but can't quite figure it out. I put comments in the code where I need help.
Can someone please point me in the right direction? Thanks!
const int buttonPin = 2;
const int ledPin = 13;
const int ledPin1 = 4;

int buttonState = 0;

void setup() {
  pinMode(ledPin, OUTPUT);
  pinMode(buttonPin, INPUT);
  pinMode(ledPin1, OUTPUT);
}

void loop() {
  buttonState = digitalRead(buttonPin);
  if (buttonState == HIGH) {
    digitalWrite(ledPin1, HIGH);
    // wait 0.5 seconds then
    digitalWrite(ledPin, HIGH);
  }
  else {
    digitalWrite(ledPin, LOW);
    // wait 4 seconds then
    digitalWrite(ledPin1, LOW);
  }
}
Look at the 'blink without delay' example in the Learning pages of this site or in the IDE.
Once you understand how that works, you will understand how to use it in your own programs.
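That example boils down to: remember when you last toggled the LED, and on every pass through loop() compare that timestamp against millis(). A minimal sketch of the same pattern — the names ledState, previousMillis and interval follow the IDE example, but blinkTick() and passing the millis() reading in as a parameter are my additions so the logic can be checked off-board:

```cpp
#include <cstdint>
#include <cassert>

const uint32_t interval = 1000;  // toggle the LED every 1000 ms
uint32_t previousMillis = 0;     // time of the last toggle
int ledState = 0;                // 0 = LOW, 1 = HIGH

// One pass of loop(): toggles ledState once 'interval' ms have elapsed.
// On real hardware, currentMillis would be millis() and you would call
// digitalWrite(ledPin, ledState) after the toggle.
void blinkTick(uint32_t currentMillis) {
    if (currentMillis - previousMillis >= interval) {
        previousMillis = currentMillis;      // remember when we toggled
        ledState = (ledState == 0) ? 1 : 0;  // flip the LED state
    }
}
```

Because blinkTick() returns immediately when the interval has not elapsed, loop() is free to read buttons, update displays, or do anything else between toggles.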
classof1980:
I need help with timing between digitalWrite() calls without using delay(). I have searched and read but can't quite figure it out. I put comments in the code where I need help.
To delay without using delays, you need to basically do this (imagine a 10 second delay):
(A) Get the current time in milliseconds (call it "snapshot")
(B) Is the current time less than "snapshot + 10000"?
(C) If YES, go back to B, if NO, 10 seconds have elapsed.
Now let's do it in real code. We will make an LED blink on and off (1/2 second on and 1/2 second off). 1/2 second is 500 milliseconds.
uint32_t st; // st is "start time"

void setup (void)
{
  pinMode (13, OUTPUT); // use the board LED as an indicator
  while (1) { // do it forever
    digitalWrite (13, HIGH); // turn on the LED
    st = millis (); // snapshot the current time
    while (millis () < (st + 500)); // loop here until 500 ms have elapsed
    digitalWrite (13, LOW); // LED off
    st = millis (); // new snapshot
    while (millis () < (st + 500)); // same as before, wait for start + 500
  }
}

void loop (void)
{
  // do nothing
}
See what this does? Now granted, you are not "gaining" anything by simply hanging in a loop waiting for "st + 500" over plain old delay(), but a more complicated program could be DOING SOMETHING ELSE inside that waiting loop, whereas if you use delay() you can't do anything else. Get it?
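To bring this back to the original sketch: instead of hanging in a wait loop at all, loop() can note the time when the button changes and do the second digitalWrite() once enough time has passed. A sketch of that idea — the 0.5 s and 4 s delays come from the original post, but tick(), led13, led4 and the edge-detection bookkeeping are my additions, with the button reading and the millis() value passed in as parameters so the logic can be checked off-board:

```cpp
#include <cstdint>
#include <cassert>

int led13 = 0, led4 = 0;   // mirrors of ledPin (13) and ledPin1 (4)
uint32_t pressTime = 0;    // when the button was last seen going HIGH
uint32_t releaseTime = 0;  // when the button was last seen going LOW
int lastButton = 0;        // previous reading, to detect changes

// One pass of loop(); on real hardware buttonState would be
// digitalRead(buttonPin) and now would be millis().
void tick(int buttonState, uint32_t now) {
    if (buttonState != lastButton) {  // button just changed state
        if (buttonState) {
            pressTime = now;
            led4 = 1;        // digitalWrite(ledPin1, HIGH); immediately
        } else {
            releaseTime = now;
            led13 = 0;       // digitalWrite(ledPin, LOW); immediately
        }
        lastButton = buttonState;
    }
    if (buttonState && now - pressTime >= 500)
        led13 = 1;           // digitalWrite(ledPin, HIGH); 0.5 s after press
    if (!buttonState && now - releaseTime >= 4000)
        led4 = 0;            // digitalWrite(ledPin1, LOW); 4 s after release
}
```

Since tick() never blocks, loop() keeps spinning and stays responsive to the button the whole time — which is the whole point of doing it this way instead of delay().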