Acquiring Data with the Arduino: How do I set a sampling frequency?

Hello,

I've been using an Arduino Uno to acquire data. To do that, I need to establish a fixed sampling frequency.

I've tried using 'delay(Sampling_Period_in_Milliseconds)' at the end of each iteration, but so far it hasn't worked very well. I use the 'millis()' function to print timestamps, so I can see how long each segment of each iteration takes.

More or less like this:

void loop(){
   to=millis();
   Serial.println(to);
   //Program Written to Acquire Data
   delay(1) //delay 1 millisecond, which is a frequency rate of 1kHz
}

And from what I can see, it's been taking about 20 milliseconds to perform each iteration, which is waaaay more than what I wanted it to be.

So, I have a couple of questions:

  1. Could it be that 'Serial.print()' is taking a lot of time to be executed and it's interfering with my time measurements?

  2. Is there a list of how long it takes to perform the basic functions on Arduino?

  3. How do you guys usually set the sampling frequency when using the Arduino? Is there a better way than using 'delay(Ts)'? I've heard about interrupts, but I have no idea how they work.

Thanks!

More code. This doesn't have a setup(), where you presumably initialize Serial and establish a bit rate. We can't tell how long it's taking you to print things, which will matter a lot in discussing your questions. We don't see what type the variable to is, and there's a semicolon missing on the final statement. This obviously isn't code that you compiled and ran; it's not even a snippet of something that ran.

In answer to your first question: yes, serial output takes time, and after you print enough to fill the serial buffer, it blocks further activity until there's room in the buffer. At 9600 bits per second, that will be very noticeable. I measure it at about 7 milliseconds while it's printing 5-digit numbers. A serial character is 10 bits (8 data bits plus start and stop), so 9600 bits per second works out to roughly a millisecond per character; five digits plus the CR and LF that println() appends is seven characters, so that's not unexpected.

More info. You don't say what you're sampling. Analog? Digital inputs?

Perhaps a better way would be to mark the time, in microseconds, of each sample, and trigger another when 1000 microseconds have passed. You won't get samples right on the clock, but they won't get behind, either.
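
Roughly like this - a bare-bones sketch of the idea. The 1000-microsecond interval and pin A0 are placeholders I picked, not something from your post:

unsigned long nextSampleTime;

void setup() {
  Serial.begin(9600);
  nextSampleTime = micros();
}

void loop() {
  // Signed comparison of the difference handles micros() rollover correctly.
  if ((long)(micros() - nextSampleTime) >= 0) {
    nextSampleTime += 1000;   // schedule from the previous deadline, so
                              // occasional lateness doesn't accumulate
    int raw = analogRead(A0); // the data acquisition goes here
    // ... process or store raw ...
  }
}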

Does a precise time between samples matter? That answer will affect how you decide to initiate a sample.

Tell more. What's your purpose?

Thank you for your answer.

I'm measuring AC voltage using the Analog input A0.

The basic code is:

int to;
int pinV=0;
float V;
void setup(){
   Serial.begin(9600);
}

void loop(){
  to=millis();
  Serial.println("Time= " + String(to));
  V=5*analogRead(pinV)/1023.0; //in Volts
  Serial.println("V= " + String(V));
  delay(1);
}

Some results:

Time: 1262 (milliseconds)
V= 3.85

Time: 1284
V= 2.61

Time: 1305
V= 1.02

Time: 1328
V= 3.46

As you can see, it's taking around 20~25 milliseconds to perform each iteration, which is a lot more than the 1 millisecond I was hoping for.

The goal of the project is to analyse whole periods of AC voltage, so it is really important to keep the sampling period constant and small.

So, let's say I stop using the 'Serial.println()' function in order for the program to run faster. How can I know how long it's taking for each iteration to be performed?


I ran your code, and got this output, in part:

Time= 5826
V= 1.37
Time= 5849
V= 1.52

Calculating how long it takes to print that:
(19 characters) * (10 bits/character) / (9600 bits/second) = 0.0198 seconds, or pretty close to 20 milliseconds. Almost all of the processor's time is spent waiting for the serial buffer to drain so that new characters can be added.

To see how fast it's really going, you might consider saving the value of micros(), acquiring some number of data points, logging the time at the end of that process, subtracting the start time, and printing the result. That will get you very close.
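
For instance - a quick sketch along those lines. The batch size of 500 and pin A0 are arbitrary choices of mine:

const int N = 500;  // number of samples per timing batch

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned long start = micros();
  for (int i = 0; i < N; i++) {
    analogRead(A0);   // acquire and discard; no printing inside the loop
  }
  unsigned long elapsed = micros() - start;
  Serial.print("microseconds per sample: ");
  Serial.println(elapsed / (float)N);
  delay(1000);
}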

Note that you won't be able to save a thousand ten-bit ADC readings that you will acquire in one second, because you'd run out of memory. But, you can process them if your purpose doesn't call for knowing the values of individual samples at the end - like, maybe taking the sum of their squares for an RMS calculation, or maybe keeping track of the maximum and minimum for a peak-to-peak measurement.
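
In that spirit, something like this - the batch size and the 5 V scaling are my assumptions, and a proper AC RMS would subtract the DC bias before squaring, which I've left out to keep it short:

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned long sumSquares = 0;
  int vMin = 1023, vMax = 0;

  for (int i = 0; i < 1000; i++) {
    int raw = analogRead(A0);                // 0..1023
    sumSquares += (unsigned long)raw * raw;  // 1000 * 1023^2 still fits in 32 bits
    if (raw < vMin) vMin = raw;
    if (raw > vMax) vMax = raw;
  }

  // Floating point only once per batch:
  Serial.print("RMS (V): ");
  Serial.println(5.0 * sqrt(sumSquares / 1000.0) / 1023.0);
  Serial.print("Peak-to-peak (V): ");
  Serial.println(5.0 * (vMax - vMin) / 1023.0);
}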

Note also that variable to is int, while the return value of millis() is unsigned long. If you want to be able to keep track of how long things take after millis() overflows an int - about 32 or 33 seconds - you'll want to be sure to use an unsigned int, which will get you to about 65 seconds, or unsigned long, which will take you as far as about 50 days.
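
Concretely, the declaration you want is just:

unsigned long to;   // matches the return type of millis()

void setup() {
}

void loop() {
  to = millis();    // safe for roughly 50 days of uptime
}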

You haven't said whether it's important to you to have your samples precisely spaced in time. It might not be important. If it is, though, you'll want to eliminate uncertainty about when the ADC reading is initiated, by auto-triggering the ADC with one of the hardware timers, as opposed to letting the sketch manage ADC conversions directly. That's a lot of learning for a guy whose current skill level has him asking this question, but certainly not, I believe, beyond your capabilities.
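
To give you a taste of what that involves - a bare-bones sketch for the Uno's ATmega328P, assuming the stock 16 MHz clock. The register names come straight from the datasheet, and you'd want to read its timer and ADC chapters before trusting this:

volatile uint16_t latestSample;
volatile bool sampleReady = false;

void setup() {
  Serial.begin(115200);   // fast enough to print one reading per millisecond

  // Timer1: CTC mode with OCR1A as TOP, prescaler 8 -> 2 MHz timer clock.
  // A period of 2000 counts (OCR1A = 1999) is 1 ms, i.e. 1 kHz.
  TCCR1A = 0;
  TCCR1B = _BV(WGM12) | _BV(CS11);
  OCR1A = 1999;
  OCR1B = 1999;   // Compare Match B fires once per period

  // ADC: channel A0, AVcc reference, clock prescaler 128,
  // auto-triggered by Timer1 Compare Match B (ADTS2:0 = 101).
  ADMUX  = _BV(REFS0);
  ADCSRB = _BV(ADTS2) | _BV(ADTS0);
  ADCSRA = _BV(ADEN) | _BV(ADATE) | _BV(ADIE)
         | _BV(ADPS2) | _BV(ADPS1) | _BV(ADPS0);
}

ISR(ADC_vect) {
  latestSample = ADC;    // read the 10-bit result
  sampleReady = true;
  TIFR1 = _BV(OCF1B);    // clear the timer flag so the next match retriggers
}

void loop() {
  if (sampleReady) {
    noInterrupts();              // a 16-bit read isn't atomic on this chip
    uint16_t s = latestSample;
    sampleReady = false;
    interrupts();
    Serial.println(s);
  }
}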

If precise timing isn't important for you, please take a look at the example program, "Blink Without Delay," that comes with the IDE. That sketch blinks an LED without using the delay() function, by recording the time that the LED last changed state, watching the current time until an appropriate interval has passed, and changing the state again. The method used in that program will generalize to a lot of other timing functions - like this one - and it's a reliable scheme for getting the processor to perform actions at, or nearly at, known times.

It also avoids the use of delay(), which makes the processor wait for an interval to elapse while doing nothing else. It's customary to call delay() a "blocking" function, because it blocks the processor from doing anything other than waiting for the delay to elapse. That may be OK with you today, but you'll need better techniques later, so you'll want to go ahead and learn this one.

I note that you're doing a floating point calculation after every sample. Floating point takes a long time, compared to integer calculations. You might be well advised to do your intermediate calculations with integer math - ints or longs, as your data needs require - and then do the floating point calculations at the end of the sample period, as the RMS sketch above does. It won't matter much for this application, since you've got a whole millisecond between samples, but it will matter if you do things faster in the future. For the same reason, you'll want to avoid using Strings during the sample period - I don't know how long they actually take, but, intuitively, I'd think they're kind of slow.