Author Topic: delay microSeconds bug?  (Read 516 times)
Sr. Member (dinosaur cork):

Try the code below, then comment out the first block and uncomment the second. I'm using Arduino 0008, so maybe this bug has since been fixed, but if your experience matches mine, the second loop counts almost twice as fast as the first and its timing is irregular, which suggests something is rolling over.

Do other people see the same thing?

delayMicroseconds() has its parameter defined as an unsigned int, but even a signed int shouldn't roll over at 20000.

Also, is there a good reason why the delayMicroseconds() parameter shouldn't be defined as a long? I needed a finer-grained and slightly longer delay when I discovered this.



Code:
int j;

void setup() {
  Serial.begin(9600);
}

void loop() {
  // First block: 100 x 10000 us = 1 second per count printed.
  for (int i = 0; i < 100; i++) {
    delayMicroseconds(10000);
  }
  Serial.println(j++, DEC);

  /* Second block: 50 x 20000 us should also take 1 second,
     but counts noticeably faster, and irregularly.
  for (int i = 0; i < 50; i++) {
    delayMicroseconds(20000);
  }
  Serial.println(j++, DEC);
  */
}
Tesla Member (SF Bay Area, USA):

From the code of delayMicroseconds():

Code:
      // the following loop takes a quarter of a microsecond (4 cycles)
      // per iteration, so execute it four times for each microsecond of
      // delay requested.
      us <<= 2;

So it looks to me like it will only work for arguments up to 16383: the parameter is a 16-bit unsigned int on the AVR, so the shift overflows for anything larger. I don't know if this is a bug, though the documentation should mention the limit. Surely by the time you get to 16000 µs you should be using delay(millisecs) instead?

Good find, though.
Tesla Member (SF Bay Area, USA):

Quote
Also, is there a good reason why the delayMicroseconds() parameter shouldn't be defined as a long?

I don't think an AVR can do a 32-bit decrement loop with 1 µs resolution, although it might be close. More complicated schemes could be used, of course.

delay() is very accurate at the millisecond (well, 1.024 ms?) level, since it's based on a hardware timer, IF you start and stop timing right when the counter changes. It should be possible to put something together that divides a long microsecond count into three parts: a first part timed in microseconds up to a counter transition, a middle part based on the counter (which would be interruptible!), and a last part to accurately count down the final microseconds...
Sr. Member (dinosaur cork):

Thanks for the info. I updated the delayMicroseconds() documentation.

I also worked out the two-delay scheme; it seemed slightly more complicated than it needed to be, but it's nice to know the reason for things in this case.

Paul