
Topic: Arduino UNO + FreeRTOS: Timing comparison with and without RTOS

ldmontibeller

I'm experimenting with FreeRTOS for my engineering thesis and I've made an interesting discovery about timing functions/tasks.

First, I am trying to compare the same (or at least very similar) functions with and without the FreeRTOS kernel to see how each performs.

So, for the simplest comparison possible, I decided to make a function that prints the elapsed time every 50 ms (50 000 us), measured with the micros() function, using delay() (the no-FreeRTOS program) and vTaskDelay() (the FreeRTOS program).

To my surprise, neither of the programs could meet the setpoint of 50 ms (50 000 us). As expected, the program without the RTOS has a greater dispersion (standard deviation) from the mean value than the program with the RTOS kernel; however, the mean value of the first program was closer to 50 ms than the other! The programs and charts of the data are attached.
Why is that? Maybe it is the way the micros() function interacts with FreeRTOS? Does anyone else have an idea?

DKWatson

Welcome to the Forum. Firstly, understand that everybody has an opinion, and while it is their right and therefore valid, it is not always correct, mine included.

There is no such thing as a real-time operating system.

That is an opinion. The closest you get is a dedicated, single thread running bare metal. Any underlying code on a single-core processor turns it into a time-share system. The argument is often made that 'things' happen so quickly it 'appears' to be running concurrent tasks in real time. This is valid as long as your sensitivities don't drop below about 15 ms, which in my experience seems to be about the typical time slice for these OSs.

As far as meeting your setpoint goes, you need to see what the time slice is. If, for example, it is 15 ms, you'll get a hit at 60, 105, 150, and so on. There will often be an option to ensure tasks are executed on or about 'exactly' when they should be, and this distinguishes between cooperative and pre-emptive concurrency. There is also the option of run-to-completion, which can affect timing as well.
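For what it's worth, the 60, 105, 150 figures fall straight out of rounding each multiple of the 50 ms setpoint up to the next 15 ms tick boundary. A throwaway sketch of that arithmetic (the 15 ms tick is only the figure quoted above, not an actual FreeRTOS constant):

Code: [Select]
// Quantisation of 50 ms targets onto an assumed 15 ms scheduler tick.
const unsigned long TICK_MS = 15;      // assumed time slice, taken from the post above
const unsigned long SETPOINT_MS = 50;  // requested period

void setup() {
  Serial.begin(9600);
  for (int n = 1; n <= 6; n++) {
    unsigned long target = n * SETPOINT_MS;                              // 50, 100, 150, ...
    unsigned long actual = ((target + TICK_MS - 1) / TICK_MS) * TICK_MS; // next tick boundary at or after the target
    Serial.print(target);
    Serial.print(" -> ");
    Serial.println(actual);            // prints 60, 105, 150, 210, 255, 300
  }
}

void loop() {}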

There are many factors that will influence the effectiveness of an RTOS, but you should never rely on the OS being perfect. If your setpoint is 50 and you're getting a mean of 55, change your setpoint. Remember that the processor doesn't know what you want it to do, only what you tell it to do. Sometimes you have to cheat.
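One concrete way to 'cheat' on bare metal (not from the post, just a common pattern) is to schedule each cycle against an absolute time reference instead of a relative delay(), so the per-iteration overhead eats into the wait rather than stretching the period:

Code: [Select]
// Sketch of absolute-time scheduling with micros(); illustration only.
const unsigned long PERIOD_US = 50000UL;   // 50 ms setpoint
unsigned long nextRelease;

void setup() {
  Serial.begin(9600);
  nextRelease = micros() + PERIOD_US;
}

void loop() {
  // Busy-wait until the next release point; the signed cast handles micros() rollover.
  while ((long)(micros() - nextRelease) < 0) { }
  nextRelease += PERIOD_US;

  Serial.println(micros());   // whatever work/printing belongs to this 50 ms slot
}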

While you're doing all this, have a look at RIOS. It's small (about 50 bytes) and can be utilized for either cooperative or pre-emptive multi-tasking. The nice thing with it is that it's not a library, you embed it in your code, so you have complete control over how it behaves, thus being able to tailor it to specific tasks.
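For readers who haven't met it: RIOS is essentially a small task table plus a scheduler loop that you paste into your own sketch. The following is not the RIOS code itself, just a minimal cooperative scheduler in the same spirit, to show roughly what 'embed it in your code' looks like:

Code: [Select]
// Minimal cooperative scheduler in the spirit of RIOS (illustration, not RIOS itself).
typedef struct {
  unsigned long periodMs;    // how often the task should be released
  unsigned long lastRunMs;   // last release time
  void (*run)(void);         // task body; must return quickly (run-to-completion)
} Task;

void blinkTask()  { digitalWrite(LED_BUILTIN, !digitalRead(LED_BUILTIN)); }
void reportTask() { Serial.println(millis()); }

Task tasks[] = {
  { 500, 0, blinkTask  },
  {  50, 0, reportTask },
};
const int NUM_TASKS = sizeof(tasks) / sizeof(tasks[0]);

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  unsigned long now = millis();
  for (int i = 0; i < NUM_TASKS; i++) {
    if (now - tasks[i].lastRunMs >= tasks[i].periodMs) {
      tasks[i].lastRunMs += tasks[i].periodMs;  // advance by the nominal period to avoid drift
      tasks[i].run();
    }
  }
}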

Good luck with your thesis. Been there, done that.
Live as if you were to die tomorrow. Learn as if you were to live forever. - Mahatma Gandhi

westfw

Quote
To my surprise, neither of the programs could meet the setpoint of 50 ms (50 000 us).
Well, to start with, you're not measuring the time taken by the delay() call; you're measuring that PLUS the loop overhead around it.
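One quick way to see the two numbers separately (illustration only, not code from the thread): timestamp the delay() call by itself, and also timestamp the whole pass through loop(); the difference is the Serial/loop overhead that gets folded into every 50 ms sample.

Code: [Select]
// Separate "delay() alone" from "delay() plus loop overhead".
unsigned long lastLoopStart;

void setup() {
  Serial.begin(9600);
  lastLoopStart = micros();
}

void loop() {
  unsigned long loopStart = micros();

  unsigned long before = micros();
  delay(50);
  unsigned long delayOnly = micros() - before;   // the delay() call by itself

  Serial.print("delay(): ");
  Serial.print(delayOnly);
  Serial.print(" us   full loop: ");
  Serial.println(loopStart - lastLoopStart);     // delay() plus the printing around it
  lastLoopStart = loopStart;
}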


Quote
As expected, the program without RTOS has a greater dispersion (standard deviation) from the mean value than the program with the RTOS kernel
Why do you expect greater dispersion without RTOS?  The Arduino by default does NOT run a "non-real time" scheduler; it runs bare metal code that ought to be very deterministic, except for the occasional timer or UART interrupt that is usually much shorter than RTOS guarantees.
If I change your code thus:

Code: [Select]
void setup() {
  Serial.begin(9600);
}

unsigned long elaps[200];   // buffered samples, printed only at the end
int pos = 0;                // advanced by testFunction1() from the original sketch

void loop()
{
  if (pos < 198) {
    testFunction1();        // OP's timing routine; presumably stores one elapsed time in elaps[] and increments pos
  } else {
    for (int i = 0; i < 200; i++) {
      Serial.print(elaps[i]);
      Serial.print("\r\n");
    }
    while (1);              // done: dump the buffer once, then stop
  }
}


to eliminate some of the loop overhead, it looks like I get less dispersion (which makes sense: fewer random-ish UART interrupts during the timing cycle).




Quote
however the mean value of the first program was closer to 50 ms than the other! The programs and charts of the data are attached. Why is that? Maybe it is the way the micros() function interacts with FreeRTOS? Does anyone else have an idea?


Since the RTOS values are consistently too low, I'd suspect that the FreeRTOS vTaskDelay() function has the usual bug where it actually counts "tick rollovers" rather than actual time. This means that any time already elapsed in the current tick is ignored (for a millisecond tick, if you call delay at time 100.000423 s, the first rollover will occur after 577 microseconds instead of a full millisecond). Arduino's delay() had this same problem; it was fixed in the AVR core a while ago, but it is still broken in the SAM/SAMD cores: https://github.com/arduino/ArduinoCore-samd/issues/356
I don't know offhand what the tick time is in the version of FreeRTOS that you're running.
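If that tick-rollover hypothesis is right, the usual FreeRTOS remedy is vTaskDelayUntil(), which delays to an absolute tick count instead of a relative number of ticks, so the partially elapsed first tick no longer shortens the period. A rough sketch, assuming the Arduino FreeRTOS library (Arduino_FreeRTOS.h) and that INCLUDE_vTaskDelayUntil is enabled in its config; the period in real time still depends on configTICK_RATE_HZ:

Code: [Select]
#include <Arduino_FreeRTOS.h>

// Fixed-period task using an absolute time reference (vTaskDelayUntil) rather
// than a relative delay (vTaskDelay).
void TaskTimed(void *pvParameters) {
  TickType_t lastWake = xTaskGetTickCount();    // reference point for the period
  const TickType_t period = pdMS_TO_TICKS(50);  // 50 ms expressed in ticks

  for (;;) {
    vTaskDelayUntil(&lastWake, period);         // wake at lastWake + period, then advance lastWake
    Serial.println(micros());                   // timestamp each release
  }
}

void setup() {
  Serial.begin(9600);
  xTaskCreate(TaskTimed, "Timed", 128, NULL, 2, NULL);
  // With this library the scheduler starts after setup() returns; other ports
  // may need an explicit vTaskStartScheduler() call here.
}

void loop() {}   // runs in the idle task in this port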

It makes sense, I think, that the Arduino bare metal code mostly shows actual numbers of "setpoint + overhead" while the FreeRTOS code shows something like "setpoint + overhead - ticktime" (and the overhead is larger).

In general, I think an RTOS tries to guarantee response/accuracy to within time epsilon, but epsilon tends to be significantly larger than the "typical" times achieved by a bare metal or non-real-time system...

Note that micros() normally has a resolution of 4 us, so labeling your Y-axis every 5 us might not be the best idea...
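A quick way to convince yourself of that granularity on a 16 MHz AVR board (Timer0 runs with a /64 prescaler, so one timer count is 4 us); illustration only:

Code: [Select]
// Successive micros() readings on a 16 MHz AVR only differ by multiples of 4 us.
void setup() {
  Serial.begin(9600);
  unsigned long prev = micros();
  for (int i = 0; i < 20; i++) {
    unsigned long now = micros();
    Serial.println(now - prev);   // always 0, 4, 8, 12, ... never 1, 2 or 3
    prev = now;
  }
}

void loop() {}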

Are your times really that clustered?   How many data points?  You can almost see the internal operation: "here are the delay() calls that had 8 serial interrupts occur during the 50ms, and here are the ones that had 9, and 10..."
