I have a circuit whose highest frequency of interest is 200 Hz, and I want to sample the signal at 500 Hz (obeying the Nyquist rule).
Currently, when I use analogRead to convert the data to digital, the complete loop takes around 5-6 ms to run without any delay, so the effective sampling rate is quite slow. I measured this by driving a pin high at the start of the loop and low at the end, and timing the interval between the two.
I am using an Arduino Due since I can set analogReadResolution to 12 bits. How can I set the sampling rate to 500 Hz so that each loop runs accurately at 2 ms?
Code is added below for more understanding. It's fairly simple code, as I'm new to Arduino and only need it to sample and digitise my signal.
const int analogInPin = A0; // analog input pin that feeds in from circuit
const int threshold = 3500; // a threshold level that's in the range of the analog input
const int ledPin = 13; // pin that the LED is attached to
signed int sensorValue = 0; // value read in after A-D Conversion
float voltage = 0;
// the setup routine runs once when you press reset:
void setup() {
// initialize the LED pin and the timing pin (3) as outputs:
pinMode(ledPin, OUTPUT);
pinMode(3, OUTPUT); // needed: the loop toggles pin 3 to measure its duration
// initialize serial communication at 9600 bits per second:
Serial.begin(9600);
// changes resolution from default to 12 bits
analogReadResolution(12);
}
// the loop routine runs over and over again forever:
void loop() {
digitalWrite(3, HIGH); // mark the start of the loop for timing
// read the input on analog pin 0:
sensorValue = analogRead(analogInPin);
// Convert the analog reading (which goes from 0 - 4095) to a voltage (0 - 3.3V):
voltage = sensorValue * (3.3 / 4095.0);
// if the value returned by the ADC is greater than the threshold,
// then set the value to 0 and flash the LED
if (sensorValue > threshold) {
sensorValue = 0;
voltage = 0;
digitalWrite(ledPin, HIGH);
} else {
digitalWrite(ledPin, LOW);
}
Serial.println(voltage);
digitalWrite(3, LOW); // mark the end of the loop for timing
delay(1);
}
It is the whole loop that takes 5-6 ms; the floating-point calculation and the serial printing probably contribute. I want the loop to take 2 ms with minimal loop-to-loop jitter and high accuracy.
Serial.begin(9600); is glacially slow: most of your loop time is spent waiting on serial I/O.
Unless you have a reason not to, use 115200 baud.
To sample at 500Hz, this is the sort of loop you need:
#define PERIOD 2000 // sample period in us (2 ms = 500 Hz)
unsigned long last_us = 0L ;
void loop ()
{
if (micros () - last_us >= PERIOD)
{
last_us += PERIOD ; // advance by PERIOD, not to micros(), so jitter doesn't accumulate
sample () ;
}
}
void sample ()
{
// do the sampling here
}
Note the avoidance of delay() and millis(), which cannot give accurate timing for short periods. delayMicroseconds() is not used because you cannot easily compensate for the time spent in the sample() function.
I will preface this with a HUGE... I am brand new to Arduino... just figured out controlling steppers and a few other things for colleagues. One such colleague asked if I could help figure out how to sample from an LDR at a constant rate (1000 Hz), and I came across these posts.
... and I have made the changes I thought would work, but I must be doing something wrong. I can sample the light level no problem, and if I mess about with PERIOD, the data rate I see in the Serial Monitor changes, but I can't seem to get it to show a 1000 Hz sampling rate.
Here is the code below. Can you see the error of my ways?
BTW... I think it is awesome that there are so many in the Arduino community who are so generous with their time and knowledge to help others. As a professor, I really appreciate others helping those less knowledgeable.
Cheers
Derek

#define PERIOD 2000 // period in us
unsigned long last_us = 0L;
const int analogInPin = A0; // Analog pin that the photocell is attached to
int sensorValue = 0; // value read from the photocell
void setup() {
// initialize serial communications at 115200 bps:
Serial.begin(115200);
}
// the timing loop from the answer above; without it, sample() is never called
void loop ()
{
if (micros () - last_us >= PERIOD)
{
last_us += PERIOD ;
sample () ;
}
}
void sample ()
{
// do the sampling here
// read the analog in value:
sensorValue = analogRead(analogInPin);
// print the results to the serial monitor:
Serial.println(sensorValue);
}