To my surprise, neither of the programs could meet the setpoint of 50 ms (50 000 us).
Well, to start with, you're not measuring the time taken by the delay() call; you're measuring that PLUS the loop overhead around it.
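As a quick illustration (my own sketch, not either of your programs), you can bracket only the delay() call with micros() so that nothing else falls inside the measurement:

void setup() {
  Serial.begin(9600);
  unsigned long start = micros();
  delay(50);                          // only the call under test
  unsigned long elapsed = micros() - start;
  Serial.println(elapsed);            // time spent in delay() alone, in microseconds
}

void loop() {
  // nothing here - the single measurement is printed from setup()
}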
As expected, the program without the RTOS has greater dispersion (standard deviation) about the mean value than the program with the RTOS kernel.
Why do you expect greater dispersion without an RTOS? By default the Arduino does NOT run some "non-real-time" scheduler; it runs bare-metal code that ought to be very deterministic, apart from the occasional timer or UART interrupt, which is usually much shorter than typical RTOS guarantees.
If I change your code thus:
void setup() {
  Serial.begin(9600);
}

unsigned long elaps[200];  // measured elapsed times, in microseconds
int pos = 0;               // next free slot in elaps[] (advanced inside testFunction1)

void loop()
{
  if (pos < 198) {
    testFunction1();       // your measurement routine: times the delay and stores the result
  } else {
    // all samples collected: dump them once, then stop
    for (int i = 0; i < pos; i++) {
      Serial.print(elaps[i]);
      Serial.print("\r\n");
    }
    while (1);             // halt so the dump only happens once
  }
}
to eliminate some of the loop overhead, it looks like I get less dispersion (which makes sense - fewer random-ish UART interrupts land inside the timed interval).
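(testFunction1() there is your measurement routine, which I haven't reproduced; I'm assuming it does something roughly like this - bracket the 50 ms delay with micros(), store the result, and bump the index:)

void testFunction1() {
  unsigned long start = micros();
  delay(50);                        // the 50 ms (50 000 us) setpoint under test
  elaps[pos] = micros() - start;    // elapsed time actually observed, in microseconds
  pos++;
}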
However, the mean value of the first program was closer to 50 ms than that of the other! The programs and charts of the data are attached. Why is that? Maybe it is the way the micros() function interacts with FreeRTOS? Does someone else have an idea?
Since the RTOS values are consistently too low, I'd suspect that FreeRTOS's vTaskDelay() function has the usual bug where it actually counts "tick rollovers" rather than actual elapsed time, which means that any time already elapsed in the current tick is ignored. (For a millisecond tick, if you call delay at time 100.000423 s, the first rollover occurs after only 577 microseconds instead of a full millisecond.) Arduino's delay() had this same problem; it was fixed in the AVR core a while ago but is still broken in the SAM/SAMD cores: delay() is inaccurate, averaging 500us short. · Issue #356 · arduino/ArduinoCore-samd · GitHub
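If that's the culprit, the usual workaround in FreeRTOS is vTaskDelayUntil(), which times each wakeup relative to the previous one, so the partial first tick doesn't shorten the period. A rough sketch, assuming the Arduino_FreeRTOS library (where the scheduler starts once setup() returns) and a tick fine enough to represent 50 ms:

#include <Arduino_FreeRTOS.h>

void timingTask(void *pvParameters) {
  TickType_t lastWake = xTaskGetTickCount();  // reference point for the fixed period
  unsigned long prev = micros();
  for (;;) {
    // block until 50 ms worth of ticks after the previous wakeup,
    // rather than "N tick rollovers from whenever we happen to be now"
    vTaskDelayUntil(&lastWake, pdMS_TO_TICKS(50));
    unsigned long now = micros();
    Serial.println(now - prev);               // period between wakeups, in microseconds
    prev = now;
  }
}

void setup() {
  Serial.begin(9600);
  xTaskCreate(timingTask, "timing", 256, NULL, 2, NULL);
}

void loop() { }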
I don't know offhand what the tick time is in the version of FreeRTOS that you're running.
It makes sense, I think, that the Arduino bare-metal code mostly shows actual numbers of "setpoint + overhead", while the FreeRTOS code shows something like "setpoint + overhead - ticktime" (and the overhead is larger).
In general, I think an RTOS tries to guarantee response/accuracy to within time epsilon, but epsilon tends to be significantly larger than the "typical" times achieved by a bare metal or non-real-time system...
Note that micros() normally has a resolution of 4 us, so labeling your Y-axis every 5 us might not be the best idea...
Are your times really that clustered? How many data points? You can almost see the internal operation: "here are the delay() calls that had 8 serial interrupts occur during the 50ms, and here are the ones that had 9, and 10..."