Trouble understanding delay

Hello guys, I am having some trouble understanding the sampling of measurements on the Arduino. I have a simple sketch that reads data from a Memsic accelerometer every 10 ms, but when I compare against a clock after, say, a couple of hundred iterations, the total time is way off. This is the Arduino sketch.

const int xPin = 2;
int pulseX;
int accelerationX;

void setup()
{
  Serial.begin(9600);
  pinMode(xPin, INPUT);
}

void loop()
{
  pulseX = pulseIn(xPin, HIGH);
  accelerationX = ((pulseX / 10) - 500) * 8;
  Serial.print(accelerationX);
  Serial.print(" \n");
  delay(10);
}

I tried to read the serial port with Python and get the elapsed time there with this code:

from __future__ import print_function
import serial
import sys
import time

if __name__ == '__main__':
    fname = sys.argv[1]
    ser = serial.Serial('/dev/tty.usbmodem1421', 9600)
    with open(fname, 'w') as f:
        t = time.time()
        while True:
            s = ser.readline()
            print((time.time() - t, s))
            print(s, end="", file=f)
            t = time.time()
    ser.close()

and this is the output of the code

(0.02042984962463379, '16 \n')
(0.020503997802734375, '16 \n')
(0.016204833984375, '16 \n')
(0.02043294906616211, '16 \n')
(0.020401954650878906, '16 \n')
(0.020617008209228516, '16 \n')
(0.02014613151550293, '16 \n')
(0.020604848861694336, '24 \n')
(0.02014899253845215, '16 \n')
(0.0205228328704834, '16 \n')
(0.01622796058654785, '16 \n')
(0.020494937896728516, '16 \n')
(0.020478010177612305, '16 \n')
(0.02043604850769043, '16 \n')
(0.020114898681640625, '16 \n')
(0.0205841064453125, '16 \n')
(0.02036595344543457, '16 \n')
(0.020672082901000977, '16 \n')
(0.016162872314453125, '24 \n')
(0.020676851272583008, '16 \n')
(0.02024102210998535, '16 \n')
(0.020776033401489258, '16 \n')
(0.020125865936279297, '16 \n')

As you can see, the elapsed time between measurements is on average about 20 ms. What am I doing wrong, and how do I get measurements sampled exactly every 10 ms?

Thanks in advance,

Andrea

Think about it.... If you put a 10 ms delay into your code, you are assuming that doing the measurement (and printing it to serial) takes no time at all.

Take the serial code out of the loop. Then remove the delay() completely. Look at the BlinkWithoutDelay example and use that approach instead to be more flexible with the timing.
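For reference, the heart of that example looks roughly like this (paraphrased from memory, not a verbatim copy of the IDE sketch):

const long interval = 1000;            // blink interval in ms
unsigned long previousMillis = 0;
int ledState = LOW;

void setup()
{
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop()
{
  unsigned long currentMillis = millis();
  if (currentMillis - previousMillis >= interval)
  {
    previousMillis = currentMillis;    // remember when we last toggled
    ledState = (ledState == LOW) ? HIGH : LOW;
    digitalWrite(LED_BUILTIN, ledState);
  }
}

The point is that the loop never blocks; it just keeps checking millis() and acts only when the interval has elapsed.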

OK, I understand that taking the measurement and writing to serial takes time. I have experimented with different numbers of readings and writes to serial, which affected the final elapsed time. I get that!

But how do I make sure that I take the measurements "exactly" at the desired time interval? Is there a library that would let me sync my measurements using a variable delay? Or how else would I do it?

How do people generally address this problem?

Andrea

Generally, they look at blink without delay for ideas.

OK, let's try an analogy to get the idea behind “blink without delay()”:

Say you have to flip a switch every minute. You also have to measure the amount of rain falling outside (we’ll assume it is raining).

To measure the rainfall, you hold out a small measuring cup and time how long it takes to fill. This takes between 2 and 20 seconds, depending on the rain.

Then you look at your watch, wait exactly 60 seconds, and flip the switch. Will you flip the switch every 60 seconds? No, because the time spent measuring gets added on top. This is what your program does.

I am sure that if you were asked to do this in real life, you would find a good way to solve it by looking at the watch at several points in the “program”. This is what millis() lets you do: you can work out when 60 seconds have passed, which is what millis() - start > length checks.
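In code, the rain-and-switch version of that might look something like this (just a sketch; measureRain() and flipSwitch() are made-up names for the analogy):

const unsigned long length = 60000;                 // one "minute" in ms
unsigned long start = 0;

void measureRain() { delay(random(2000, 20000)); }  // stand-in: takes 2 to 20 s, like the cup
void flipSwitch()  { /* flip the switch here */ }

void setup() { }

void loop()
{
  measureRain();                     // may take a variable amount of time, and that is fine

  if (millis() - start > length)     // have 60 seconds passed since the last flip?
  {
    start = millis();
    flipSwitch();
  }
}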

Hope this helps

ohh ... that was helpful mSquare!! ... I believe I have now got it: millis() is the function I was looking for.

Thanks for helping out an arduino noob :)

Andrea

Yep. The code:

if (millis() - start > desired_interval)
{
  // do something
}

is equivalent to "Are we there yet?"

I’ve never liked all the blink-without-delay stuff. Yeah, you can make it work that way, but why? There are libraries out there that will handle it in a much cleaner fashion. Some libraries still depend on polling, but some use interrupts, so your time-critical code can interrupt your foreground task and happen when it is supposed to happen, which to me is the way things should be done.

For code that uses interrupts see the MsTimer2 and FlexiTimer2 libraries:
http://www.pjrc.com/teensy/td_libs_MsTimer2.html
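As a rough sketch of how MsTimer2 could pace a reading every 10 ms (untested; pulseIn() and Serial should stay out of the interrupt handler, so the handler only raises a flag):

#include <MsTimer2.h>

const int xPin = 2;
volatile bool sampleDue = false;

void tick()                      // runs from the timer interrupt every 10 ms
{
  sampleDue = true;              // just raise a flag; do the real work in loop()
}

void setup()
{
  Serial.begin(9600);
  pinMode(xPin, INPUT);
  MsTimer2::set(10, tick);       // call tick() every 10 ms
  MsTimer2::start();
}

void loop()
{
  if (sampleDue)
  {
    sampleDue = false;
    int pulseX = pulseIn(xPin, HIGH);
    int accelerationX = ((pulseX / 10) - 500) * 8;
    Serial.println(accelerationX);
  }
}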

For a better/easier way to handle scheduling than having to monkey with the blink-without-delay style of polling millis() values yourself, see the SimpleTimer library:
http://playground.arduino.cc//Code/SimpleTimer
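A sketch of the SimpleTimer version (untested, based on the playground page's setInterval()/run() interface; the library polls millis() for you):

#include <SimpleTimer.h>

const int xPin = 2;
SimpleTimer timer;

void takeReading()
{
  int pulseX = pulseIn(xPin, HIGH);
  int accelerationX = ((pulseX / 10) - 500) * 8;
  Serial.println(accelerationX);
}

void setup()
{
  Serial.begin(9600);
  pinMode(xPin, INPUT);
  timer.setInterval(10, takeReading);   // run takeReading() every 10 ms
}

void loop()
{
  timer.run();                          // let the timer check millis() and fire callbacks
}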

or the Metro library:
http://www.pjrc.com/teensy/td_libs_Metro.html
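And roughly the same thing with Metro (untested; Metro just wraps the millis() check in an object):

#include <Metro.h>

const int xPin = 2;
Metro sampleMetro = Metro(10);          // 10 ms interval

void setup()
{
  Serial.begin(9600);
  pinMode(xPin, INPUT);
}

void loop()
{
  if (sampleMetro.check() == 1)         // returns 1 when the interval has elapsed
  {
    int pulseX = pulseIn(xPin, HIGH);
    int accelerationX = ((pulseX / 10) - 500) * 8;
    Serial.println(accelerationX);
  }
}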

— bill

The advantage of using millis() is that you can see and understand what is going on. With a library, although you can look at the source, most people don't, so there may be a conflict, particularly when it comes to a library's use of timers interfering with other functions.

How many times have you seen a question along the lines of "I am using the servo library but my motor speed control does not work"? The standard Servo library prevents the use of PWM on pins 9 and 10, which is at least documented, but the inner workings of libraries can be a mystery.

In any case, compared with the complications of much other code, using millis() is simple once the concept is grasped, which a real-world analogy helps to do, and using meaningful variable names helps enormously.

The idiom for regularly performing a task is this:

unsigned long last_time ;

#define DELAY_PERIOD  10

void setup ()
{
  ...
  last_time = millis () ;
}

void loop ()
{
  if (millis () - last_time >= DELAY_PERIOD)   // has the next scheduled time arrived?
  {
    last_time += DELAY_PERIOD ;   // advance by the period, not to millis(), so errors don't accumulate
    ... do the thing here ...
  }
  ...
}

This code relies on the task taking less time than the DELAY_PERIOD (on average). It keeps
time as accurately as millis() can determine it in the long run.
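Applied to the original accelerometer sketch, that would look something like this (untested; it assumes pulseIn() plus the serial print fit inside the 10 ms on average):

const int xPin = 2;
#define DELAY_PERIOD  10          // sample every 10 ms

unsigned long last_time ;

void setup ()
{
  Serial.begin (9600) ;
  pinMode (xPin, INPUT) ;
  last_time = millis () ;
}

void loop ()
{
  if (millis () - last_time >= DELAY_PERIOD)
  {
    last_time += DELAY_PERIOD ;
    int pulseX = pulseIn (xPin, HIGH) ;
    int accelerationX = ((pulseX / 10) - 500) * 8 ;
    Serial.println (accelerationX) ;
  }
}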