Storing data and its RTC timestamp

I'm using an MKR Zero to read data from sensors, and I want to store this data with the exact time the sensor took the measurement.

I thought about using three different arrays: one for the data, one for the RTC time (when the measurement happened), and a third for the type of sensor.

I'm just wondering if it is possible to store the exact time the measurement happened.


Or a single array of structs.

Do you know the exact time?

You can store the value of millis() before and after calling the function that reads the sensor and take the average (if the reading takes a long time).
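Roughly like this, as a sketch; readSensor() is just a placeholder for whatever your reading function is:

uint32_t tBefore = millis();
int32_t value = readSensor();                             // placeholder: your sensor read
uint32_t tAfter = millis();
uint32_t measureTime = tBefore + (tAfter - tBefore) / 2;  // midpoint = time of measurement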

You would be better off with an array of structures rather than three arrays; this way all the info about one measurement is centralised:

enum t_sensorType : uint8_t {unknownSensor, temperatureSensor, humiditySensor, pressureSensor};

struct t_measure {
  t_sensorType type;          // which sensor produced the value
  uint32_t     measureTime;   // when it was taken (millis() or epoch seconds)
  int32_t      measureValue;  // the reading itself
};

const size_t maxMeasures = 30;
t_measure myMeasures[maxMeasures];
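Recording a measurement then just fills the next slot. A minimal sketch (recordMeasure() and the millis() timestamp are my choices; swap in an RTC time if you have one):

size_t measureCount = 0;

void recordMeasure(t_sensorType type, int32_t value) {
  if (measureCount < maxMeasures) {                      // drop readings once full
    myMeasures[measureCount].type         = type;
    myMeasures[measureCount].measureTime  = millis();    // or an RTC timestamp
    myMeasures[measureCount].measureValue = value;
    measureCount++;
  }
}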

What precision do you need?

Search for DATALOG and your board.
There should be lots of tutorials.

How often do you want to measure?

Every 30 mins, so I'm not looking for great precision (1-5 mins is okay).

An RTC offers more accuracy than you need, but is so simple and cheap.
bildr » Do You Have The Time? DS1307 RT Clock + Arduino
Just don't bother printing the seconds. The DS3231 is about the same price, and uses the same code unchanged.


Yes, in that case an RTC will meet your needs. Storing the time in seconds will only eat up 4 bytes.
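For example, with a DS3231 and the Adafruit RTClib library (a minimal sketch; you set the clock once, then unixtime() gives you a 4-byte timestamp):

#include <RTClib.h>

RTC_DS3231 rtc;

void setup() {
  rtc.begin();
  if (rtc.lostPower()) {
    // set the clock from the sketch's compile time
    rtc.adjust(DateTime(F(__DATE__), F(__TIME__)));
  }
}

void loop() {
  uint32_t measureTime = rtc.now().unixtime();  // seconds since 1970, 4 bytes
  // ... take the reading and store it with measureTime ...
  delay(30UL * 60UL * 1000UL);                  // every 30 minutes
}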

I have done amps for power and temperature for weather.

For the weather, I would take a temperature reading every 10 seconds and keep track of:

  • the highest temperature
  • the lowest temperature
  • the average temperature

Then, on the hour, I would write to my datalog (date, hour, highest, lowest, average), then reset the values and start again (a rough sketch follows).
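Something like this; all names are mine, and addSample() would be called every 10 seconds with the new reading:

float highTemp, lowTemp, sumTemp;
uint16_t sampleCount = 0;

void addSample(float t) {                 // called every 10 seconds
  if (sampleCount == 0) { highTemp = t; lowTemp = t; sumTemp = 0; }
  if (t > highTemp) highTemp = t;
  if (t < lowTemp)  lowTemp  = t;
  sumTemp += t;
  sampleCount++;
}

void logHour() {                          // called on the hour
  if (sampleCount == 0) return;
  float avg = sumTemp / sampleCount;
  // write date, hour, highTemp, lowTemp, avg to the datalog here
  sampleCount = 0;                        // reset and start again
}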

It was neat to watch the three graphs in the morning as the highs would increase.
In the end, all I needed was the average.

When doing power, the clamp-on CT monitored apparent current continuously. For that I was looking for peaks and dips, so I logged the number of highs above a threshold as well as the 3 peak highs for the hour.

That seems like overkill, but if you aren't draining a battery I assume it's OK.

USB power in my front yard, so battery was not used.

The idea was to test the sensor, the volume of data, and the robustness of my Wi-Fi connection.
In the end Google Sheets was totally unusable for large amounts of data, as it took way too long to load.

I used 50 degrees F as my base, and the array only stores the degrees over or under that; I kept it all as ints.
The minute average was stored in a different array and the first one reset.
For the running math, I multiplied the reading by 10 and saved it as an int.
For the datalog, it was divided by 10, as I did not see the need for more than 1 decimal place.
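Reading that as storing the offset from the 50 F base, times 10, a sketch of the round trip (names are mine):

float baseTemp = 50.0;                 // degrees F, the reference point

int encodeTemp(float tempF) {          // offset from base, x10, as an int
  return (int)((tempF - baseTemp) * 10.0);
}

float decodeTemp(int stored) {         // back to degrees, 1 decimal place
  return baseTemp + stored / 10.0;
}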

I guess I could have just used delay() so as to not waste CPU cycles. LOL

OK (a packet every 10 s is then not much stress for the Wi-Fi 🙂)

I may not have explained it fully; I was trying not to overwhelm the OP.

Reading every 10 seconds:
50 degrees minus the new reading,
×10, saved to an array as a byte.

After 60 seconds, sum the array and divide by 6,
and save the result to a second array.

After one hour:

  • sum the array, divide by 60 (still ×10)
  • calculate the value with 1 decimal
  • upload to Google Sheets as a float (rough sketch below)
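A rough sketch of that two-stage buffering (the names and the signed-byte encoding are my reading of it):

float baseTemp = 50.0;                    // running base, degrees F
int8_t secondReadings[6];                 // one reading per 10 s -> 6 per minute
int8_t minuteAverages[60];                // one average per minute -> 60 per hour
uint8_t secIdx = 0, minIdx = 0;

void every10Seconds(float tempF) {
  secondReadings[secIdx++] = (int8_t)((tempF - baseTemp) * 10.0);
  if (secIdx == 6) {                      // a full minute collected
    int16_t sum = 0;
    for (uint8_t i = 0; i < 6; i++) sum += secondReadings[i];
    minuteAverages[minIdx++] = sum / 6;   // minute average, still x10
    secIdx = 0;
  }
  if (minIdx == 60) {                     // a full hour collected
    int32_t sum = 0;
    for (uint8_t i = 0; i < 60; i++) sum += minuteAverages[i];
    float hourAvg = baseTemp + (sum / 60.0) / 10.0;  // back to degrees F
    // upload date/hour, high, low, hourAvg as a float here
    minIdx = 0;
  }
}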

The hourly reading was:
date/hour, high, low, average

No need for minutes, and using 1 decimal stripped away the unneeded decimal places.
Using the running base kept readings within a byte:
there was never a 25-degree change in an hour, but there were well over 25-degree changes in a day.

The idea was to get the annual last-frost and first-frost days and the highest summer temperatures.
It got more involved, but for the purpose of this thread, the OP may want to think about what happens over the whole hour.

OK, while I am discussing this.....
Each hour left an average temperature; that was the new base temp for the next hour, to keep all the readings small enough to save as a byte.
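In code terms, something like this (hourAvg and baseTemp as in my earlier sketches):

void endOfHour(float hourAvg) {
  // the hour's average becomes next hour's reference point,
  // keeping the stored offsets small enough for a signed byte
  baseTemp = hourAvg;
}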

Now the deep end of the pool.
The minute readings for the hour were also converted to:
minutes between 30F and 40F
minutes between 40F and 50F
minutes between 50F and 60F
minutes betw..... you get the idea.

31 F is not harmful if it only lasts a minute: a cool breeze.
31 F for 10 minutes is bad; for 2 hours it is a killer.
So knowing the number of minutes per day the temperature was in some range was part of the study.
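A sketch of that binning (names are mine, and it ignores the finer resolution above 100 F mentioned below):

uint16_t minutesInBand[15];            // band 3 = 30-40F, band 4 = 40-50F, ...

void countMinute(float tempF) {        // called once per minute
  int band = (int)(tempF / 10.0);
  if (band < 0)  band = 0;             // clamp the extremes
  if (band > 14) band = 14;
  minutesInBand[band]++;
}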

That might help to explain why Google Sheets filled up as much as it did. Each row was:
day/hour, max, min, avg, 0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110

Since we rarely get over 100 F, the max high and the time over 100 had a finer resolution.
And anything under 30 was considered frost, so how much under 30 was not as important as how long under 30.

Understood
