I am consistently getting the following results using delayMicroseconds:
set value   actual   error   deviation
16000       16228    1.4%    8 µs (0.04%)
8000        8112     1.4%    8 µs (0.09%)
4000        4052     1.3%    4 µs (0.1%)
The percentage error is roughly the same in each case (~1.4%), so it appears I am seeing a proportional inaccuracy rather than a fixed offset.
This is the test code:
unsigned long before = 0;   // micros() returns an unsigned long
unsigned long after = 0;
before = micros();
delayMicroseconds(16000);
after = micros();
Serial.print("time: ");
Serial.println(after - before);
To check how much effect the micros() call itself was having, I inserted 4 more micros() calls before the "after" read. For the delayMicroseconds(16000) test the recorded value was then 16240, which increased the error from 1.4% to 1.5%, showing that the micros() call had little bearing on the results.
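For reference, a minimal way to gauge that overhead directly (a sketch of my own, not the code used for the figures above) is to time a few back-to-back micros() calls:

unsigned long t0 = micros();
micros(); micros(); micros(); micros();   // the four extra calls mentioned above
unsigned long t1 = micros();
Serial.print("overhead of 4 extra micros() calls: ");
Serial.println(t1 - t0);                  // result is in microseconds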
When I examine results for lower timings (200, 100 and 50 µs), the error increases disproportionately, and so does the deviation between results:
value   avg     error   deviation
200     202.6   1.3%    8 µs (3.9%)
100     102.6   2.6%    8 µs (7.8%)
50      53.3    6.6%    4 µs (7.5%)
I am just wondering: is this the amount of error I should expect to see, or is there something I am not interpreting correctly?
You are asking too much from this function. First off, micros() always returns a multiple of 4 µs (on the Uno/Mega/etc.; possibly not on other boards), so there is that error.
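You can see that granularity with a quick check (my own sketch, not from this thread) that samples micros() back-to-back and prints the differences afterwards:

unsigned long t[10];
for (int i = 0; i < 10; i++) {
  t[i] = micros();                  // sample back-to-back, no printing in between
}
for (int i = 1; i < 10; i++) {
  Serial.println(t[i] - t[i - 1]);  // on a 16 MHz Uno every difference is a multiple of 4 µs
}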
Second, interrupts are running in the background and can inject variability as well.
Third - what are you trying to do that requires such accuracy? The most accurate means would be to set up a timer interrupt, which will be spot on (unless you turn off interrupts and do other nasty things).
I have measured the elapsed time and used the difference from the desired time to adjust the actual 'delay' value. Instructions like Serial.print() consume clock ticks.
The ESP32 chip contains two hardware timer groups. Each group has two general-purpose hardware timers. They are all 64-bit generic timers based on 16-bit prescalers and 64-bit up / down counters which are capable of being auto-reloaded.
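If you are on an ESP32, a rough sketch of driving one of those hardware timers through the Arduino-ESP32 core (the timerBegin/timerAlarm calls shown here follow the core v2.x API; it changed in v3.x) could look like this:

hw_timer_t *timer = NULL;
volatile bool tick = false;

void IRAM_ATTR onTimer() {
  tick = true;                          // keep the ISR short: just set a flag
}

void setup() {
  Serial.begin(115200);
  timer = timerBegin(0, 80, true);      // timer 0, prescaler 80 -> 1 tick = 1 µs at 80 MHz
  timerAttachInterrupt(timer, &onTimer, true);
  timerAlarmWrite(timer, 16000, true);  // fire every 16000 µs, auto-reload
  timerAlarmEnable(timer);
}

void loop() {
  if (tick) {
    tick = false;
    // do the time-critical work here, paced by the hardware timer
  }
}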
delayMicroseconds() is a cycle-counting loop, so time spent in background code such as interrupt handlers isn't counted and simply stretches the delay. Any delay over 1024 µs will include timer0 overflow interrupts, and in your example you'll also have serial interrupts. You should see different times if you wait for the serial output to finish:
before = micros();
delayMicroseconds(16000);
after = micros();
Serial.print("time: ");
Serial.println(after - before);
delay(5000); // wait for serial output to complete.
Oh - the AVR also has hardware timers that can be used to achieve more accurate delays, but by default they're configured for use by analogWrite(). Search out the "timer1" library.
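For example, a minimal sketch using the TimerOne library (assuming it is installed; calls per the PJRC TimerOne documentation) that produces a steady 16 ms tick instead of relying on delayMicroseconds():

#include <TimerOne.h>

volatile bool tick = false;

void onTimer() {
  tick = true;                  // flag is set every 16000 µs by the hardware timer
}

void setup() {
  Serial.begin(115200);
  Timer1.initialize(16000);     // period in microseconds
  Timer1.attachInterrupt(onTimer);
}

void loop() {
  if (tick) {
    tick = false;
    // timing-critical work goes here
  }
}

Note that using Timer1 this way takes over the timer that analogWrite() uses for PWM on pins 9 and 10 of an Uno.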