Exact time of analog input data acquisition point with system clock

Hello

I am working on sensor data acquisition with an Arduino Mega.

I have two different systems to get the sensor data: the Arduino Mega and company-provided software for the sensor.

The company-provided software takes the data every second.

On the Mega I just wrote a basic loop for the analog input and set the delay to 1 sec.

The problem is that at the beginning the data is acquired at the same time by both systems,

but as time goes on, there seems to be a growing delay on the Arduino Mega,

so the data taken by the Mega ends up completely shifted and no longer matches the data taken by the company-provided software.

I attached a picture of the graph.

Blue is the data from the software and orange is from the Arduino…

So I kinda want to see the exact moment when the Arduino takes the data…

Like using a reference clock to see when the Arduino takes the data when I set the delay to 1 sec.

Is that possible, or is there a way to do it?

Can I use the Arduino's 16 MHz system clock to find out the exact moment or time when the data is taken?

I am not sure I explained it properly, but basically I want to know why there is a huge shift on the graph.

Please help me with this problem, any advice would be great.

Thank you

Using just an Arduino, there is no way to know "the exact time". For that matter, there may be no way for the company-provided software to know "the exact time" either.

Both would have to be synchronized to an external time reference. One way is to use a GPS unit with a 1 pulse-per-second output as a signal to begin data acquisition.

In your current situation, you could figure out a scale factor and offset that best superimposes the two waveforms. I would use a PC for that.

Showing your code may help, because if you're using delay() you may get out of step, as nothing is happening during the delay time. Try reading the current MCU time, then read the sensor and send the data (via serial?), and then go into a loop reading the MCU time until a second has expired, and then start all over again. You should hopefully not lose time so quickly.

jremington: Using just an Arduino, there is no way to know "the exact time". For that matter, there may be no way for the company-provided software to know "the exact time" either.

Both would have to be synchronized to an external time reference. One way is to use a GPS unit with a 1 pulse-per-second output as a signal to begin data acquisition.

In your current situation, you could figure out a scale factor and offset that best superimposes the two waveforms. I would use a PC for that.

Sorry, I have been busy with other work and have only checked this now.

Thanks for your advice, but what about using micros() or millis()?

That would give us the time whenever a sample is taken... but I am worried that, since micros() and millis() are also commands inside the loop, they still will not give us the correct time. Right?

Riva:
Showing your code may help, because if you're using delay() you may get out of step, as nothing is happening during the delay time. Try reading the current MCU time, then read the sensor and send the data (via serial?), and then go into a loop reading the MCU time until a second has expired, and then start all over again. You should hopefully not lose time so quickly.

My code is way too simple to put on here.

It is just one analogRead() inside loop() and delay(1000). That is it.

Sensor data keeps coming out of the sensor and I take it with one of the analog input pins.

I do not quite understand what you told me to do:

“Try reading the current MCU time, then read the sensor and send the data (via serial?), and then go into a loop reading the MCU time until a second has expired, and then start all over again”

Can you please explain this sentence a bit more? Thanks

If your code is that simple, then just slightly adjusting the delay would probably have done to make the traces match better, but they will still drift over time. Maybe find a way to sync the Mega reading with the PC software reading (you would need to tell us more about the sensor and its setup).

I was thinking of something like the code below, expecting you were doing several other calculations during the loop time. The code should wait the correct amount of time no matter what extra code you put in, as long as it completes within the loopDelay time.

#define loopDelay 1000UL  // Loop delay time (may need tweaking to match MCU resonator speed a bit better)
unsigned long snapTime;

void setup() {
  Serial.begin(115200);
}

void loop() {
  while (millis() - snapTime < loopDelay) {}  // Wait till time is up (subtraction is rollover-safe)
  snapTime = millis();                        // Note the current time
  int sensor = analogRead(A0);                // Take the reading
  Serial.println(sensor);                     // Send it over serial
}

On (re-)entry to loop() the code waits until the expected time since the last pass has expired, then falls through to noting the current time, reading the analogue value and sending it. It then falls out of loop(), where the Arduino core does possible housekeeping like servicing serial etc., and then re-enters loop() at the start. Because the Arduino (probably) has a resonator rather than a crystal, the loop timing may be a bit off, so tweaking the loopDelay value may be needed.

No, the correct way to perform a regular task is this:

#define DELAY 1000UL

unsigned long timestamp = 0UL;

void loop()
{
  if (millis() - timestamp >= DELAY)
  {
    timestamp += DELAY;   // You must step the target time EXACTLY 1000 ms,
                          // not re-read millis(), which could have changed.
    do_my_thing();        // even if this sometimes takes > 1 second, the system recovers true time
  }
}

Then all you have to worry about is whether the board uses a crystal (accurate) or ceramic resonator (inaccurate) for its system clock.