Please help troubleshoot my plant waterer

Hello,

I was going on holiday for 10 days, and wanted to put together a simple system to water my plants. To my dismay, I came home to dead plants!

I had hooked up an Arduino Uno to a water pump, and programmed it to turn on for 1 minute every 12 hours using the following code (this is essentially the same as the Blink example).

// the setup function runs once when you press reset or power the board
void setup() {
  pinMode(7, OUTPUT); // initialize digital pin 7 as an output. pin 7 connects to the water pump
  pinMode(LED_BUILTIN, OUTPUT); // initialize onboard LED as an output.
}

// the loop function runs over and over again forever
void loop() {
  
  digitalWrite(7, HIGH);   // set 7 to high. 
  digitalWrite(LED_BUILTIN, HIGH);   // set onboard LED to high
  delay( 60000 );          // wait for 1 minute
  
  digitalWrite(7, LOW);    // set 7 to low
  digitalWrite(LED_BUILTIN, LOW);   // set onboard LED to low
  delay( 43140000 );       // wait for 11hrs and 59 mins
  
}

There are other components (e.g. a PSU, a MOSFET, and a water bucket as a reservoir), but I suspect the issue does not lie with them, because: 1. before leaving, I tested my setup using 10-second intervals in both calls to delay(), and it worked as expected; and 2. when using the code above, the pump also activated as expected on startup and restarts.

However, after the first activation, the pump never activated again (I know this because the water bucket reservoir was full when I returned after 10 days).

This leads me to suspect that using a long time in the second call to delay() is not the correct way to achieve a 12-hour wait. I did a bit of follow-up reading about the disadvantages of using delay(), but I couldn't find any explanation of why it failed completely.

Any help in understanding why this didn't work would be appreciated!

Larry


Welcome to the forum

Your topic was MOVED to its current forum category as it is more suitable than the original

It’s probably worth showing us how it’s all wired together with a schematic and clear photos.

Hi Larry

Your problem may be connected to long delays. Read here about how to make your program safe from timer counters overflowing:

https://www.norwegiancreations.com/2018/10/arduino-tutorial-avoiding-the-overflow-issue-when-using-millis-and-micros/

delay() should work for about 49 days before the counter overflows.
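To put numbers on that: millis() and micros() both return a 32-bit unsigned counter, so the wrap periods fall straight out of 2^32. A quick check in plain C++ (nothing Arduino-specific; the function names are just for illustration):

```cpp
#include <stdint.h>

// 32-bit tick counters (like millis() and micros() on the Uno) wrap after 2^32 ticks.
const uint64_t WRAP = 4294967296ULL; // 2^32

// Whole days before millis() (1 tick = 1 ms) wraps around.
uint64_t millisWrapDays() {
    return WRAP / (1000ULL * 60 * 60 * 24); // ~49 days
}

// Whole minutes before micros() (1 tick = 1 us) wraps around.
uint64_t microsWrapMinutes() {
    return WRAP / (1000000ULL * 60); // ~71 minutes
}
```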

@kronerson, whatever the problem is, it is not caused by variables overflowing, and millis() is not even being used.

Without using a real-time clock, a more conventional approach might be to use millis() to count minutes. On each one-minute tick, translate minutes into hour:min and turn the pump on at a specific time and off at a specific time (e.g. the next minute).
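To make that concrete, here is a minimal sketch of the scheduling half. The names (pumpShouldRun, MINUTES_PER_CYCLE) and the exact schedule are my own assumptions, and the Arduino I/O is reduced to a comment so the decision logic stands alone as plain C++:

```cpp
#include <stdint.h>

// Assumed schedule: pump on for the first minute of every 12-hour block.
const uint32_t MINUTES_PER_CYCLE = 12UL * 60UL; // 720 minutes

// Decide pump state from a running minute counter.
bool pumpShouldRun(uint32_t minuteCount) {
    return (minuteCount % MINUTES_PER_CYCLE) == 0;
}

// On the Arduino side, loop() would do something like:
//
//   static uint32_t lastTick = 0;
//   static uint32_t minuteCount = 0;
//   if (millis() - lastTick >= 60000UL) {   // one-minute tick, rollover-safe
//       lastTick += 60000UL;
//       minuteCount++;
//       digitalWrite(7, pumpShouldRun(minuteCount) ? HIGH : LOW);
//   }
```

The schedule logic can then be exercised on a PC without waiting 12 hours for anything to happen.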

I beg to differ

If the code is required to do nothing other than turn on an output for a period, wait for a period and repeat forever, then delay() is a perfectly good solution

The problem is likely somewhere in the detail we don’t have.

Agreed

Sounds like you're arguing that it's not a valid approach, when it's just one of many, including the use of delay().

I'm not suggesting the use of delay() is not a valid solution, just suggesting another.

Using millis() for timing is a perfectly good way of doing it, but for this application I would suggest that it is not the more conventional approach

Why introduce complexity when it is not needed?


Not complexity; it adds testability.

This approach can report intermediate events, each one-minute tick, without needing to wait 12 hours for an event.

Agreed, but only at the expense of more complexity for a project that does not need such functionality

What is the point of reporting intermediate events when you are not there to see them?

I certainly believe you are there when developing and testing the code.

During testing you can accelerate the timing so that a minute is incremented every second, to see if the events occur when expected.
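One way to sketch that acceleration (the ACCELERATED_TEST switch and the names here are just an illustration, not code from this thread):

```cpp
#include <stdint.h>

// Hypothetical compile-time switch: in test builds a "minute" lasts one
// real second, so 12 hours of schedule plays back in 12 minutes.
#ifdef ACCELERATED_TEST
const uint32_t MINUTE_MS = 1000UL;   // 1 s stands in for 1 min
#else
const uint32_t MINUTE_MS = 60000UL;  // real minute
#endif

// Minutes elapsed for a given span of milliseconds.
uint32_t elapsedMinutes(uint32_t elapsedMs) {
    return elapsedMs / MINUTE_MS;
}
```

The rest of the program only ever sees minute counts, so nothing else changes between the test build and the real one.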

Why are you arguing? First you argue that you disagree with a suggestion. Then you argue that it's too complex. Now you argue that you're not there while testing?

I am not arguing about the use of millis() in general

I contend that

  • using delay() is perfectly valid for this project
  • millis() is more complicated to use than delay() for timing for this project
  • accelerated testing using millis() has nothing to offer for this project

What happens if you run it for two minutes on, two minutes off?

I would be inclined to put a UL suffix on your delay times as a precaution too.
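For what it's worth, the UL suffix matters because on the Uno's AVR core an int is only 16 bits, so an unsuffixed expression like 60 * 1000 is computed in 16-bit math and wraps before it ever reaches delay(). Plain literals like 60000 and 43140000 are actually safe, since the compiler promotes a literal that doesn't fit in int up to long; it's computed values that bite. A small demonstration, using int16_t to stand in for the AVR's 16-bit int (function names are mine, just to illustrate):

```cpp
#include <stdint.h>

// On an AVR, int is 16 bits, so 60 * 1000 would be truncated to 16 bits.
// Casting to int16_t reproduces that truncation on any host.
int16_t avrStyleProduct(int16_t a, int16_t b) {
    return (int16_t)(a * b);  // wraps past 32767, as 16-bit int math would
}

// The fix is to force 32-bit arithmetic with a UL suffix:
uint32_t safeProduct(uint32_t a, uint32_t b) {
    return a * b;             // e.g. 60UL * 1000UL == 60000
}
```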

Not sure about that; the definition of delay() uses the micros() function:

void delay(unsigned long ms)
{
	uint32_t start = micros();

	while (ms > 0) {
		yield();
		while ( ms > 0 && (micros() - start) >= 1000) {
			ms--;
			start += 1000;
		}
	}
}

Are you saying that delay() cannot be relied upon because micros() overflows?

Note that subtraction is used in determining the end of the period and this works even after a rollover
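The unsigned subtraction trick is easy to verify in isolation. A check of elapsed-time maths straddling a wrap, in plain C++ with hand-picked counter values:

```cpp
#include <stdint.h>

// Elapsed ticks between two readings of a free-running 32-bit counter.
// Unsigned subtraction is modulo 2^32, so the result stays correct even
// when `now` has wrapped past zero and `start` has not.
uint32_t elapsed(uint32_t now, uint32_t start) {
    return now - start;
}
```

With start just before the wrap (0xFFFFFFF0) and now just after it (0x00000010), the subtraction still yields the true 32-tick gap.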

I am not sure about it; it's not that I have a way to test it. However, I am sure that it is better not to rely on overflows, and I would try to avoid them as much as possible. micros() overflows 1000 times more often than millis(); who knows what one of those overflows could do.

Note that I am not sure about it, but since there is definitely a problem with a program that looks fine, I think that something could be malfunctioning internally.