Arduino as a PC-based DAQ: Best practices

Hi Folks!

I used to log my hobby experiments using an Arduino + SD card shield. I have also used Python to read the USB serial port and store the data in a CSV file. That was easy, because timing was not an issue at low sampling rates.

Now I wonder how to reliably measure at significantly higher data rates with an Arduino, e.g. acceleration above 500 sps, since that requires a precise sampling rate and controlled jitter. My ultimate goal is to use the full capability of industrial SPI sensors and an external 24-bit SPI ADC.

I would like to know:

  • How can I precisely control the sampling frequency, and how can I make sure that the set frequency does not wander? (I use timer interrupts).
  • How can I minimize the timing jitter? (Calculating the time delta between samples gives me erratic numbers).
  • How is the time recorded? (The OS timestamp is not precise enough.)
  • How is it done in commercial PC-based measurement systems? (e.g. National Instruments USB DAQ devices)

The question is somewhat broad, but any hint would be helpful.
Thanks!

What is the physical phenomenon you are measuring? It's very easy to ask for fast sampling rates that might not always be needed. The cost, in money and work, increases rapidly as the demands get higher. Don't overdo things, is my advice.

nebulae:
...Now I wonder how to reliably measure at significantly higher data rates with an Arduino, e.g. acceleration above 500 sps, since that requires a precise sampling rate and controlled jitter. My ultimate goal is to use the full capability of industrial SPI sensors and an external 24-bit SPI ADC...

Without having researched the topic, I'd be surprised if accelerometers with analog output and 24-bit resolution exist. In system design, understanding what performance is necessary and what is possible is key.

How can I precisely control the sampling frequency, and how can I make sure that the set frequency does not wander? (I use timer interrupts).

Use an ADC in free-running mode; then the sampling frequency stability is largely a function of the reference clock. The ATmega328 has such a mode, as do any number of SPI ADCs.

How can I minimize the timing jitter? (Calculating the time delta between samples gives me erratic numbers).

ditto

How is the time recorded? (The OS timestamp is not precise enough.)

If you're running through a microcontroller, it would be easy enough to apply a timestamp from a primary reference such as the 1 pulse per second from a suitable GPS unit. That gets you to something on the order of 1 millisecond absolute time. One can do much better with a precision time/frequency reference that does averaging, but it's doubtful that level of accuracy is necessary for most applications. It's also important to distinguish between the need for absolute time accuracy and relative accuracy. Relative time (e.g. event B happened 123.456 milliseconds after event A) is sufficient for many (most?) instrumentation applications and doesn't require a primary reference.

How is it done in commercial PC-based measurement systems? (e.g. National Instruments USB DAQ devices)

See above

How can I precisely control the sampling frequency, and how can I make sure that the set frequency does not wander? (I use timer interrupts).

Timer interrupts are one way. If you already use them, what's not clear there?

How can I minimize the timing jitter? (Calculating the time delta between samples gives me erratic numbers).

What jitter do you get? With timer interrupts the jitter should be minimal. If you get erratic numbers, I suspect your code is wrong. Since you didn't post your code, it's up to you to find the error.

How is the time recorded? (The OS timestamp is not precise enough.)

My OS provides nanosecond time stamps. Do you really need more precision? What precision do you need?

How is it done in commercial PC-based measurement systems? (e.g. National Instruments USB DAQ devices)

"it" = time stamps? I guess they use a timer in the external measurement device and log the OS time stamp. But of course it's possible that they include an RTC in the external device and transmit the measurement with time stamps.


Railroader:
What is the physical phenomenon you are measuring? It's very easy to ask for fast sampling rates that might not always be needed. The cost, in money and work, increases rapidly as the demands get higher. Don't overdo things, is my advice.

I want to measure vibrations, but not only that. To capture a vibration at X Hz, a sampling rate of at least 2·X is required (the Nyquist criterion). Money is not a huge problem, as I already have the tools and parts. A used commercial DAQ would be cheaper, but cost and work are not the point here. I just want to learn more.

MrMark:
Without having researched the topic, I'd be surprised if there exist accelerometers with analog output and 24 bit resolution. In system design understanding what performance is necessary and possible are key.

Analog sensors do exist: https://www.pcb.com/
Digital sensors too: Silicon Sensing | MEMS Inertial Sensor & IMU Manufacturers

The 24-bit part is my ADC board.

MrMark:
Use a ADC in a free running mode, then the sampling frequency stability is largely a function of the reference clock. The Atmega328 has such a mode as do any number of SPI ADCs.

I'll investigate the free-running mode, but what about SPI sensors? And do you mean commercial DAQs have more sophisticated crystals?

pylon:
Timer interrupts are one way. If you already use them, what's not clear there?
What jitter do you get? Using timer interrupts the jitter should be minimal. If you get erratic numbers I guess your code is wrong. As you failed to post your code it's up to you to find the error.

I once made a vibration measurement at 1000 sps with both an Arduino and a commercial DAQ, and the frequency measurements showed a significant difference (I compared the FFTs and histograms). I tried calculating the sampling times, and they were erratic. I don't have the code anymore, because I gave up on it. Now I want to understand how I could minimize the jitter.

pylon:
My OS provides nanosecond time stamps. Do you really need more precision? What precision do you need?

Yes, the OS can provide nanosecond timestamps; that's what I've been using in my Python script. But does that mean we should rely on them? It could work on an RTOS, but I am using two separate systems: the Arduino is not synchronized with the OS. Maybe I'm wrong, but I don't think commercial DAQs work like that.

pylon:
"it" = time stamps? I guess they use a timer in the external measurement device and log the OS time stamp. But of course it's possible that they include an RTC in the external device and transmit the measurement with time stamps.

I mean commercial PC-based DAQs. I would like to know what the common practices are, e.g. the most important things to consider. Of course without going too deep into detail, since this is work done by engineers with degrees in the field and many years of experience, and not easy to explain.

Maybe I'm wrong, but I don't think commercial DAQs work like that.

I guess commercial DAQs have a constant sampling frequency and simply add the time stamp as if the sampling frequency were actually constant, ignoring any transmission delays.

I tried calculating the sampling times, and they were erratic.

Maybe the calculation was wrong?

I don't have the code anymore, because I gave up on it.

That means we cannot rule out that an error in your code was responsible for the problem.
