Question about working with millis()

Hi all,

I am sending analogue sensor data from a UNO to my PC over serial. Since the sensor value is a little unstable on every read without any delay, I thought I could send an averaged number every 10 reads or so, with a delay. Here is how I did it.

void loop() {
  if (millis() - startTime > 100) {
    // read the value from the sensor:
    sensorValue += analogRead(sensorPin);
    Serial.println(timeChecked);
    //Serial.print("|");
    //Serial.print(sensorValue);
    //Serial.print("|");
    //sensorValue = sensorValue/(timeChecked+1); 
    //Serial.println(sensorValue);
    startTime = millis();
    timeChecked = 0;
    sensorValue = 0;
  } else {
    //When it is not 100 milliseconds yet, do this:
    sensorValue += analogRead(sensorPin);
    timeChecked += 1;
    delay(10);
  }
}

The interesting part is, it works for the first 30 seconds or so. With delay(10), timeChecked prints 10 or 11 every 100 milliseconds. And then, all of a sudden, timeChecked prints 0 and stays like that for the rest of the runtime. I have tried other delays; it didn't seem to make a difference.

Did I write the logic wrong? Is it not in the format Arduino likes? Please help~

Please post your code.

You know that stuff at the top of the code that must be there or the compiler complains? It is necessary for us to see that part to help debug your code.

I suspect that you have defined startTime as an int. The largest value that can be stored in an int is 32767, so your device will malfunction after 32.767 seconds.

Additionally, is sensorValue a type which is big enough to contain 10 times the maximum analog reading?
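
If so, something along these lines should cure both problems (just a sketch of the declarations, using your variable names):

unsigned long startTime = 0;  // millis() returns unsigned long, so the timer variable must match
long sensorValue = 0;         // plenty of headroom for a sum of many 0-1023 readings
int timeChecked = 0;          // counts the reads in the current 100 ms window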

Sorry, first time posting things here. Like this? I don't have any includes.

int sensorPin = A0;    // select the input pin for the sensor
int buttonPin = 2;
int ledPin = 13;      // select the pin for the LED
int sensorValue = 0;  // variable to store the value coming from the sensor
int startTime = 0;
int timeChecked = 0;

void setup() {
  Serial.begin(9600);
  // declare the ledPin as an OUTPUT:
  pinMode(ledPin, OUTPUT);
  pinMode(buttonPin, INPUT_PULLUP);

  startTime = millis();
}

void loop() {
  // read the value from the sensor:
  if (millis() - startTime > 100) {
    sensorValue += analogRead(sensorPin);
    //Serial.println(timeChecked);
    //Serial.print("|");
    //Serial.print(sensorValue);
    //Serial.print("|");
    //Serial.println(sensorValue/(timeChecked+1));
    //sensorValue = sensorValue/(timeChecked+1);
    //Serial.println(sensorValue);
    startTime = millis();
    timeChecked = 0;
    sensorValue = 0;
  } else {
    sensorValue += analogRead(sensorPin);
    timeChecked += 1;
    Serial.println(timeChecked);
    delay(10);
  }
}

MorganS:
You know that stuff at the top of the code that must be there or the compiler complains? It is necessary for us to see that part to help debug your code.

I suspect that you have defined startTime as an int. The largest value that can be stored in an int is 32767, so your device will malfunction after 32.767 seconds.

Additionally, is sensorValue a type which is big enough to contain 10 times the maximum analog reading?

Thanks, Morgan, that's it!!

I had run into the second problem before, where sensorValue turned negative, but I did not think of the 32.767 seconds part. Thanks so much!

That does not report the running average of 10 readings. To do that, you need an array of size 10 to hold all the readings, plus the running sum. When you take a new reading, you subtract the oldest value from the sum and add in the newest one.
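
Roughly like this (an untested sketch; the array size and names are just for illustration):

const int N = 10;
int readings[N];      // the last N raw readings (globals start zeroed, so the first few averages read low)
int idx = 0;          // position of the oldest reading, overwritten next
long runningSum = 0;  // sum of the N stored readings

int runningAverage(int newReading) {
  runningSum -= readings[idx];  // drop the oldest reading from the sum
  runningSum += newReading;     // add the newest
  readings[idx] = newReading;   // store it in place of the oldest
  idx = (idx + 1) % N;          // advance, wrapping around the array
  return runningSum / N;
}

Then loop() just calls runningAverage(analogRead(sensorPin)) and prints the result.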

Slightly more consistent timing:

  if (millis() - startTime >= 100) {
    sensorValue += analogRead(sensorPin);
    startTime += 100;  // Set to when it SHOULD have happened, rather than when it DID happen.
    timeChecked = 0;
    sensorValue = 0;
  }

If, for some reason, the loop is a millisecond late and fires at xxxxx01 instead of xxxxx00, then by adding 100 instead of taking the current time (xxxxx01), your timer won't drift.

startTime += 100;  // Set to when it SHOULD have happened, rather than when it DID happen.

I was originally impressed by the logic of that, but I got bitten by it more than once, and now I just use

startTime = millis();

unless the small errors that may result really are important.

The problem I had was that if time had accumulated since startTime was first set, it took several calls to this piece of code before startTime actually caught up with millis(). For example, if startTime is still 0 when millis() has already reached 5000, the block fires 50 times in quick succession before the timing settles down.

...R

startTime += interval; is good if you must catch up on missed samples. There are very few situations that require this, but they do exist. In that case, set the initial startTime at the end of setup(), or detect the first iteration with if (startTime == 0).

If you want to skip missed samples (the usual case) then use...

  while(millis() - startTime > interval) startTime += interval;
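
In context it might look like this (a sketch, with the reading code left out):

unsigned long startTime = 0;
const unsigned long interval = 100;

void loop() {
  if (millis() - startTime > interval) {
    while (millis() - startTime > interval) startTime += interval;  // skip any missed intervals
    // ... take and report the averaged reading here ...
  }
}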