I may not have explained it fully; I was trying not to overwhelm the OP.
reading every 10 seconds,
take the base temp (50 degrees in this example) minus the new reading,
x10, and save it to an array as a byte.
after 60 seconds, sum the array, divide by 6,
and save the result to a second array.
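a minimal sketch of that sampling stage, assuming Arduino-style C++; readTempF() is a stand-in for whatever sensor call is actually used, and I store reading minus base (the sign convention does not matter as long as it is consistent). note that a signed byte at x10 only covers about +/-12.7 F of swing from the base, so a bigger hourly range would need a wider type or a coarser scale:

```cpp
#include <stdint.h>

float readTempF();            // stand-in for the actual sensor read

float base = 50.0;            // running base temp, reset each hour
int8_t tenSec[6];             // six 10-second samples = one minute
int8_t minuteAvg[60];         // sixty 1-minute averages = one hour
uint8_t sec10 = 0, minuteIdx = 0;

// call once every 10 seconds
void sample() {
  // store the offset from base, scaled x10, so 0.1 F steps fit in a byte
  tenSec[sec10++] = (int8_t)((readTempF() - base) * 10.0);
  if (sec10 == 6) {                             // one full minute collected
    int sum = 0;
    for (uint8_t i = 0; i < 6; i++) sum += tenSec[i];
    minuteAvg[minuteIdx++] = (int8_t)(sum / 6); // minute average, same scale
    sec10 = 0;
  }
}
```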
after one hour,
- the array was summed and divided by 60 (still on the x10 scale)
- converted back to degrees, calculated with 1 decimal
- uploaded to google sheets as a float.
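continuing that sketch, the hourly roll-up could look like this; the divide by 60 happens while the numbers are still x10 offsets, and one more divide by 10 brings back real degrees (uploadToSheets() is hypothetical):

```cpp
void uploadToSheets(float avgF);   // hypothetical uploader

// end of the hour: average the 60 minute values and undo the x10 scale
void hourRollup() {
  int sum = 0;
  for (uint8_t i = 0; i < 60; i++) sum += minuteAvg[i];
  float avgF = base + (sum / 60.0) / 10.0;  // back to real degrees F
  uploadToSheets(avgF);                     // sent as a float, 1 decimal
  minuteIdx = 0;                            // start filling the next hour
}
```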
the hourly reading was
date/hour, high, low, average.
no need for minutes, and using 1 decimal stripped away the unneeded decimal places.
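as a record, each hour then boils down to something like this; the field names and date format here are just an illustration:

```cpp
// one row per hour; the exact date/hour text format is an assumption
struct HourRecord {
  char dateHour[14];   // e.g. "2024-03-14 06" (no minutes needed)
  float high;          // hour's max, 1 decimal
  float low;           // hour's min, 1 decimal
  float avg;           // hour's average, 1 decimal
};
```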
using the running base kept the readings small enough to store as a byte.
there was never a 25 degree change within an hour, but there were well over 25 degree changes within a day.
the idea was to get the annual last frost and first frost days and the highest summer temperatures.
it got more involved, but for the purpose of this thread, the OP may want to think about what happens over the whole hour.
OK, while I am on the subject.....
each hour left an average temperature, and that became the new base temp for the next hour. the point was to keep every reading as small as possible so it could be saved as a byte.
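in code terms the hand-off is a single assignment; lastHourAvg here is the value the roll-up sketch above produced:

```cpp
// at rollover the finished hour's average becomes the next hour's base,
// so the next hour's x10 offsets start near zero and stay byte-sized
void startNextHour(float lastHourAvg) {
  base = lastHourAvg;
}
```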
now the deep end of the pool.
the minute readings for the hour were also converted to
minutes between 30F and 40F
minutes between 40F and 50F
minutes between 50F and 60F
minutes betw..... you get the idea.
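a sketch of that tally, run once per minute average; with these (assumed) 10 F bins, bandMinutes[3] ends up as minutes between 30F and 40F, bandMinutes[4] as 40F to 50F, and so on:

```cpp
#include <stdint.h>

uint8_t bandMinutes[12] = {0};   // 10 F bands: 0-9, 10-19, ..., 110-119

// call once per minute with that minute's temperature in F
void tallyBand(float tempF) {
  int band = (int)(tempF / 10.0);
  if (band < 0)  band = 0;       // clamp anything below 0 F
  if (band > 11) band = 11;      // clamp anything above 119 F
  bandMinutes[band]++;
}
```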
31 degF is not harmful if it only lasts a minute; a cool breeze.
31F for 10 minutes is bad, and for 2 hours it is a killer.
so knowing the number of minutes per day the temperature spent in each range was part of the study.
that might help to explain why google sheets filled up as much as it did.
day/hour, max, min, avg, 0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110
since we rarely get over 100F, the max high and the time over 100 got a finer, 1 degree resolution.
and anything under 30 was considered frost, so how far under 30 was not as important as how long under 30.
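pulling those choices together, a per-minute tally behind that row might look like this; the bin edges follow the column list above, and everything under 30F just bumps one frost-minutes counter:

```cpp
#include <stdint.h>

uint16_t frostMin   = 0;    // minutes under 30 F: how long, not how far
uint16_t coarse[11] = {0};  // columns 0,10,...,100: 10 F bands
uint16_t fine[10]   = {0};  // columns 101..110: 1 F bands above 100 F

// call once per minute; wide counters so a full day (1440 min) fits
void tallyRow(float tempF) {
  if (tempF < 30.0) frostMin++;     // frost is frost; just count the time
  if (tempF >= 101.0) {
    int deg = (int)tempF - 100;     // 101.x -> 1 ... 110.x -> 10
    if (deg > 10) deg = 10;         // lump anything hotter into 110
    fine[deg - 1]++;
  } else {
    int band = (int)(tempF / 10.0);
    if (band < 0) band = 0;         // clamp below 0 F
    coarse[band]++;                 // 10 F resolution up to the 100 column
  }
}
```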