Measuring rainfall [Solved] [For now!]

I have a Sparkfun weather meter.

I started to suspect that the amount of rainfall the rain gauge was reporting was lower than it should be, so I put a tray outside in the rain and waited for it to fill up with water. When the rain stopped I measured the depth of the water and compared it to the rainfall as measured by the rain gauge. There is a rough discrepancy of 50%: the rain gauge reports half as much rainfall as I measure in the tray. I have repeated this several times.

I've not finished checking things like silly software mistakes (50% is very suspicious) and I don't need help with that. What I would like to know is:

Is the apparently obvious method of putting a tray out in the rain and then measuring the depth of water actually valid?

If you have experience of that particular rain gauge, did you validate its accuracy by any means, and if so do you have any comments?

Thanks.

I don't know how the gauge operates, but perhaps calibrate it using a known volume of water poured into the thing.
Paul

Paul_KD7HB:
I don't know how the gauge operates, but perhaps calibrate it using a known volume of water poured into the thing.
Paul

There is a see-saw arrangement with a little bucket on each side. In the top there is a rectangular funnel to collect the rain, which goes through a small hole to the see-saw thing. The water collects into whichever side is raised up until there is enough water to tip that side down. When it tips the other side fills up and so it repeats. As it tips it moves a magnet past a reed switch to create a brief pulse.

The usual rain gauge has an identified catchment area - usually a circle with a sharp edge. The vessel beneath may have a smaller diameter or be conical so that the depth is multiplied for easier reading. The vessel would be calibrated to allow for the effects of the shape. Hope that makes sense.

Any vessel with an identifiable entry surface area would do. It may be more accurate to weigh the water rather than measure its depth.

...R

The description says it is a "tipping bucket" gauge. Usually this is implemented by a magnet and reed switch, so you get one pulse each time the bucket tips (according to the docs, for every 0.011 inch of rain).

If you can verify that no pulses are missed, then a fix would be to change the additive constant in the code to reflect reality. All sensors can benefit from calibration!

Given the small area of the capture opening on that system, I would expect droplets to adhere to the sides and evaporate rapidly under conditions of light drizzle or intermittent showers, so the gauge would tend to underestimate the true rainfall.

Is the apparently obvious method of putting a tray out in the rain and then measuring the depth of water actually valid?

I use a straight sided bucket with a ruler, and that agrees pretty well with the professional reports from the local airport. We are on a rainwater system for everything, so I have to keep track.

Robin2:
The usual rain gauge has an identified catchment area - usually a circle with a sharp edge. The vessel beneath may have a smaller diameter or be conical so that the depth is multiplied for easier reading. The vessel would be calibrated to allow for the effects of the shape. Hope that makes sense.

Any vessel with an identifiable entry surface area would do. It may be more accurate to weigh the water rather than measure its depth.

...R

I agree that weighing would be more accurate, but with a 50% discrepancy I am not too concerned about accuracy at the moment. Yes, that all makes sense, thanks.

jremington:
The description says it is a "tipping bucket" gauge. Usually this is implemented by a magnet and reed switch, so you get one pulse each time the bucket tips (according to the docs, for every 0.011 inch of rain).

If you can verify that no pulses are missed, then a fix would be to change the additive constant in the code to reflect reality. All sensors can benefit from calibration!

Given the small area of the capture opening on that system, I would expect droplets to adhere to the sides and evaporate rapidly under conditions of light drizzle or intermittent showers, so the gauge would tend to underestimate the true rainfall.
I use a straight sided bucket with a ruler, and that agrees pretty well with the professional reports from the local airport. We are on a rainwater system for everything, so I have to keep track.

The document I have says 0.2794mm per pulse, which agrees with the 0.011 inch you have.

I have only checked this in a storm, like 10 to 20mm of rain in a few hours, so I don't think light rain evaporating off the sides is the problem :slight_smile:
(Although a wet cat coming in and leaving muddy paw-prints everywhere is another story :o )

Your bucket and ruler method seems to validate what I have tried, thanks.

The gauge is mounted on a pole on the end of the house (not the one shown in the documentation, it's too wobbly) with a D1 Mini in the loft to relay the data over Wi-Fi. I'm trying to avoid taking it down.

More investigation needed, thanks for suggestions.

There's more rain heading my way UK rainfall radar map - Met Office

Time for more tests.

This may or may not have any relevance to your question. My rain gauge was given to me several years ago as a volunteer weather reporting station. It is about 10 inches long and wedge shaped, so tiny amounts of rain can be read at the bottom of the gauge. And transparent plastic.
This morning I checked the level, then shook down all the drops still on the side of the thing and got 0.01 inch more of water from the droplets.
So, I wonder if your device is holding some water on the plastic surface and not letting go of it when it dumps. That could add up over time, if the held water evaporates between operations.
Paul

Hi Paul,

I've been making the measurements in heavy rain, I have a metal tray with (almost) vertical sides that at 20mm deep holds >800ml of water (I spilled some while checking), so the discrepancies I am talking about are way more than a few drops of rain stuck to the sides. We've had a few rain storms recently, which has enabled me to get 10 to 20mm of rain in a few hours.

I'm getting the raw number of pulses and multiplying by the calibration factor and getting roughly half what is in the tray.

My 2 lines of thought at the moment are:

The debounce time on the pulse count input was too long and I was missing some pulses as a result; it is now set to 1ms. As the thing is stuck up a pole outside, making it difficult to trigger manually, I am leaving it until another time to find out just how long the pulses really are.

I am now sending the actual state of the reed switch as well as the count in my data so I can see if there's anything odd there, such as it getting stuck.

Watch this space.

Thanks.

If the wind was blowing the rain sideways, the effective capture area of the sensor would be smaller.
Paul

Interesting point. Surely that applies to my tray as well though?

And I keep coming back to 50%.

Suspicious that it's 50% and also the device toggles...

Me too, hence the second idea in reply #8.

Just use interrupts... just kidding.

long lastDebounceTime = 0;  // the last time the output pin was toggled
long debounceDelay = 50;    // the debounce time; increase if the output flickers

void loop() {
  if (Debounce(buttonPin) == HIGH) {
    count1++;
  }
}

Try adding a debounce to your switch circuit; I think you may be getting an error in reporting. If you are reading more than you suspect it should be, then this may help. Pretty straightforward: if your catch will hold 1/4" of water and it dumped 8 times, your rainfall should be around 2". For debugging, add a counter for your switch and see if it agrees with your output. Hope this helps in some way.

Hi Larry,

I have debounce.

The error is the other way around, the rainfall as measured by a tray left outside on my wall is twice what the rain gauge reports. When I say 'twice' I mean as best I can measure using a ruler to measure the depth of water in the tray. If I measure 20mm then the gauge shows around 10mm.

I'm not getting extra counts, I'm not getting enough.

aarg:
Just use interrupts... just kidding.

++Karma; // ROFLMAO ! ! !

(Regular readers will know why that's funny :slight_smile: )

Humor us. Show the code....

SteveMann:
Humor us. Show the code....

Well, I could but I'm not asking for help with code :slight_smile:

It's in 4 files....

The first thing I would check, double check, and maybe even triple check is that inches are not secretly being converted to centimetres somewhere they shouldn't be, or the other way around. That's a factor of 2.54, pretty close to the factor of 2 error you see...

If the code is in four files, that calls for trimming it to the smallest example that still shows the problem. For a rain meter that should be no more than a few lines of code. If nothing else, it rules out code errors.