I was going on holiday for 10 days, and wanted to put together a simple system to water my plants. To my dismay, I came home to dead plants!
I had hooked up an Arduino Uno to a water pump, and programmed it to turn on for 1 minute every 12 hours using the following code (this is essentially the same as the Blink example).
// the setup function runs once when you press reset or power the board
void setup() {
  pinMode(7, OUTPUT);           // initialize digital pin 7 as an output; pin 7 drives the water pump
  pinMode(LED_BUILTIN, OUTPUT); // initialize the onboard LED as an output
}

// the loop function runs over and over again forever
void loop() {
  digitalWrite(7, HIGH);           // set pin 7 high (pump on)
  digitalWrite(LED_BUILTIN, HIGH); // set onboard LED high
  delay(60000);                    // wait for 1 minute
  digitalWrite(7, LOW);            // set pin 7 low (pump off)
  digitalWrite(LED_BUILTIN, LOW);  // set onboard LED low
  delay(43140000);                 // wait for 11 hours 59 minutes
}
There are other components (e.g. a PSU, a MOSFET, and a water bucket as a reservoir), but I suspect the issue does not lie with them, because: 1. before leaving, I tested my setup using 10-second intervals in both calls to delay, and it worked as expected; and 2. when using the code above, the pump also activated as expected on startup and after restarts.
However, after the first activation the pump never ran again (I know this because the water bucket reservoir was still full when I returned after 10 days).
This leads me to suspect that passing a long interval to the second call to delay is not the correct way to achieve a 12-hour wait. I did some follow-up reading on the disadvantages of using delay, but I couldn't find any explanation for why it would fail completely.
Any help in understanding why this didn't work would be appreciated!
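One point worth checking first: the argument itself is in range. delay() takes an unsigned long, which is 32 bits on AVR boards like the Uno, so its maximum is 4,294,967,295 ms (roughly 49.7 days), and 43,140,000 ms fits comfortably. A quick arithmetic check (the constant names here are just for illustration):

```cpp
#include <cstdint>

// delay()'s parameter type on AVR Arduinos is a 32-bit unsigned long.
const uint32_t DELAY_MAX_MS = 4294967295UL;                  // 2^32 - 1 ms, about 49.7 days
const uint32_t WAIT_MS      = 43140000UL;                    // the literal from the sketch
const uint32_t HOURS_11H59  = (11UL * 60 + 59) * 60 * 1000;  // 11 h 59 min, derived in ms
```

So whatever went wrong, it is not the literal overflowing delay()'s parameter.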
Without using a real-time clock, a more conventional approach is to use millis() to count minutes. On each one-minute timer expiration, translate the minute count into hour:minute, and turn the pump on at a specific time and off at a specific time (e.g. the next minute).
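To make that concrete, here is a minimal sketch of the millis() approach. This is an illustration rather than a drop-in fix: the ON minute and the function names are assumptions, and millis() is mocked so the logic can be exercised off-hardware (on a real board, delete the mock and use the Arduino millis()).

```cpp
#include <cstdint>

// Mocked clock so the logic can run off-hardware; on a real Arduino,
// delete these two lines and the built-in millis() is used instead.
static uint32_t fake_now = 0;
static uint32_t millis() { return fake_now; }

const uint32_t MINUTE_MS = 60000UL;

uint32_t lastTick    = 0;  // millis() value at the last whole minute
uint32_t minuteOfDay = 0;  // 0..1439, wraps every 24 hours

// Assumed schedule: pump on during minute 0 of each 12-hour half-day,
// i.e. 1 minute on, 11 h 59 min off -- matching the original timing.
bool pumpShouldRun() {
  return (minuteOfDay % 720UL) == 0;
}

// Call this from loop() as often as possible. The unsigned subtraction
// millis() - lastTick stays correct even when millis() rolls over
// (after about 49.7 days), which a raw ">=" comparison of absolute
// timestamps would not.
void tick() {
  if (millis() - lastTick >= MINUTE_MS) {
    lastTick += MINUTE_MS;
    minuteOfDay = (minuteOfDay + 1) % 1440UL;
  }
}
```

loop() would then just call tick() and drive pin 7 from pumpShouldRun(), so the sketch never blocks for hours and can do other work between minutes.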
If the code is required to do nothing other than turn an output on for a period, wait for a period, and repeat forever, then delay() is a perfectly good solution.
I certainly believe you are there when developing and testing the code.
During testing you can accelerate the timing, so that a "minute" elapses every second, to see whether the events occur when expected.
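One hypothetical way to do that acceleration is a single compile-time scale factor, so the same schedule code runs 60x faster on the bench (the TEST_MODE flag and constant name here are assumptions, not anything from the original sketch):

```cpp
// Hypothetical test switch: build with -DTEST_MODE so each "minute"
// lasts one second, and a 12-hour schedule plays out in 12 minutes.
#ifdef TEST_MODE
const unsigned long MINUTE_MS = 1000UL;   // accelerated: 1 s per "minute"
#else
const unsigned long MINUTE_MS = 60000UL;  // real time: 60 s per minute
#endif
```

The rest of the sketch only ever refers to MINUTE_MS, so nothing else changes between the test build and the real one.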
Why are you arguing? First you argue that you disagree with a suggestion. Then you argue that it's too complex. Now you argue that you're not there while testing?
I am not sure about this, and I had no way to test it, but I am sure it is better not to rely on overflows; I would avoid them as much as possible. micros() overflows 1000 times more often than millis(), and who knows what one of those overflows could do.
Note that I am not certain, but since there is definitely a problem with a program that looks fine, I think something could be malfunctioning internally.