Accuracy of Arduino delay() command

Hello all. Not sure if this is in the correct place, sorry if it's not...

I have been searching for a few hours on this, but cannot seem to find a reasonable answer to this.

Basically, I am doing an assignment and am using short(ish) pulses, from 1 ms to 100 ms. I have two questions:

  • Is there any way to get shorter pulses, such as 0.5 ms or even 0.2 ms pulses?
  • How accurate is the pulse command?

The first one is not crucial, I can live with 1ms pulses.

However, as this is for a formal lab report, I need to state my error margins. To what extent is the delay command accurate? I would imagine something like 0.01 ms; however, I really have no clue what it might be.

Any help would be greatly appreciated.

Thanks in advance,
Fudge.

Is there any way to get shorter pulses, such as 0.5 ms or even 0.2 ms pulses?

delayMicroseconds
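
For example, something like this gives a 0.5 ms pulse (the pin number and the gap between pulses are arbitrary choices, just for illustration):

const int pulsePin = 13;        // any digital pin will do

void setup() {
  pinMode(pulsePin, OUTPUT);
}

void loop() {
  digitalWrite(pulsePin, HIGH);
  delayMicroseconds(500);       // 0.5 ms high
  digitalWrite(pulsePin, LOW);
  delay(10);                    // arbitrary gap before the next pulse
}

Bear in mind that digitalWrite() itself takes a few microseconds on a 16 MHz AVR, which starts to matter at this scale.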

How accurate is the pulse command?

Which "pulse commnad"? pulseIn?

Oops... my bad. Proofreading failure.

By "pulse command" I meant delay(). Basically, my code turns a pin on, delays for however many milliseconds, then turns it off. How accurate is the delay command?

For shorter pulses you may need to use direct port manipulation, as digitalWrite() itself takes some time too. Or use digitalWriteFast (search for it on the forum), which does just that but works only for fixed pins.
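
A sketch of the direct-port version, assuming an ATmega328-based board (Uno/Nano) where digital pin 13 is bit 5 of PORTB - adjust the port and bit for your board:

void setup() {
  DDRB |= _BV(DDB5);          // pin 13 as output (what pinMode() does)
}

void loop() {
  PORTB |= _BV(PORTB5);       // pin HIGH in a couple of clock cycles
  delayMicroseconds(200);     // 0.2 ms pulse
  PORTB &= ~_BV(PORTB5);      // pin LOW
  delay(10);                  // arbitrary gap
}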

How accurate is the delay command?

For long time spans, it is very accurate. For short time spans, not so much. Given your concern for accuracy, you should be using delayMicroseconds and, as @robtillaart mentioned, direct port manipulation.

To assess accuracy you need an external time standard.

As for the rest... what board, what speed, what crystal? Is it 8 MHz or 16 MHz?

Not sure how to help.

I believe delay() has 1 ms of jitter, as it just waits for millis() to increase by the appropriate value.

delayMicroseconds() is good but disables interrupts...

I suggest you compromise between the two and have a loop waiting for micros() to increase - the jitter/granularity will be poorer than with delayMicroseconds(), but it doesn't stop interrupts:

unsigned long start = micros();
while (micros() - start < delay_in_us) {}  // unsigned subtraction is safe across micros() rollover
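
Packaged as a helper it might look like this (the function name is my own invention, not a core API; micros() ticks in 4 us steps on a 16 MHz board, which sets the granularity):

void delayMicrosKeepInterrupts(unsigned long delay_in_us) {
  unsigned long start = micros();
  while (micros() - start < delay_in_us) {
    // busy-wait; interrupts keep running the whole time
  }
}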

The delay() function is accurate to within +/- 4 us for 16 MHz Arduinos and to within +/- 8 us for the 8 MHz flavor, irrespective of delay duration.

The delayMicroseconds() function can be used for delays in the sub-millisecond range. Exact timing, however, requires interrupts to be disabled (you have to code for this yourself) while delayMicroseconds() is running.
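
For example (pulsePin standing in for whatever pin you are driving; note that millis() and micros() stop advancing while interrupts are off):

noInterrupts();               // keep ISRs from stretching the pulse
digitalWrite(pulsePin, HIGH);
delayMicroseconds(500);       // cycle-counting delay, works with interrupts off
digitalWrite(pulsePin, LOW);
interrupts();                 // timekeeping resumes here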

Ah, I was wrong - delay() already uses the approach I gave, so the jitter is much less than 1 ms. Note, though, that the accuracy of delay() is still affected by any slow interrupt handlers you have running.

RE: delay ... For long time spans, it is very accurate. For short time spans, not so much.

What was I thinking! That is definitely no longer correct. My apologies.

delay() was changed in the 0020 release (the first version supporting the Uno).
I'm actually not all that fond of the change. It's bigger and less elegant, and I think it's no longer true that the time read from millis() will differ by N after you delay() for N milliseconds (though I'm not sure where that would matter.)
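
For context, the post-0020 delay() busy-waits on micros() rather than millis(), roughly like this (paraphrased from the core, not an exact copy):

void delay(unsigned long ms) {
  uint16_t start = (uint16_t)micros();
  while (ms > 0) {
    if (((uint16_t)micros() - start) >= 1000) {
      ms--;                   // another full millisecond has elapsed
      start += 1000;
    }
  }
}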

westfw:
I'm actually not all that fond of the change. It's bigger and less elegant, and I think it's no longer true that the time read from millis() will differ by N after you delay() for N milliseconds (though I'm not sure where that would matter.)

I believe we concluded at the time that the new delay() function was 20 bytes shorter than what it replaced, and also that accuracy improved from +/- 2 ms to +/- 4 us (at 16 MHz). There were a number of posts complaining about short-delay inaccuracy prior to the change, and I haven't seen any since.

The millis() difference after delay(N) was as much an issue before as it is now (+/- 2 ms accuracy).

So then, what is the better general-purpose millisecond delay() function?