I have a device which contains a fast loop to measure how long a certain button is pushed, and how long it is not being pushed.
They are small loops with a counter and a delay(10), so I can get a pretty good idea of how long the button is pushed, or how long it is at rest.
Now those loops also have an instruction that writes over the COM port to my Serial Monitor, so I can debug things and get a feeling for how long I push the button myself.
Anyway...
The instruction that writes the text (it contains the actual time) takes some time itself, I guess.
A 10 ms loop is no longer a 10 ms loop if it also has to take care of writing a small amount of data to the Serial Monitor, is it?
Am I correct in this?
It depends. Obviously, if you're executing more instructions in your loop, it will take longer, but you could reduce your delay to compensate.
The more important factors are how fast you're running the serial port, how much data you're sending, and how often. If the outgoing serial buffer is full, your code will block waiting for space to become available.
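You can see both effects with something like this (a rough sketch, assuming a typical AVR board whose transmit buffer is 64 bytes; the baud rate is just an example). While the buffer has room, the print returns in tens of microseconds; once the buffer fills, the measured cost jumps to milliseconds because the call has to wait for space.

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned long before = micros();
  Serial.println(millis());        // queue a handful of bytes for transmission
  unsigned long cost = micros() - before;
  Serial.println(cost);            // how long the print call itself took, in us
  delay(10);
}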
In the beginning, I sent the actual time (as in the value of the increasing byte counter) to the Serial Monitor on every 10 ms cycle.
The "time" was only roughly taken: the value of that byte was multiplied by the 10 ms, which should give a reasonable time to use further on.
If the button is pushed for a certain amount of time, within some tolerance, a certain action will take place:
if (Rteller >= 10 && Rteller <= 50)   // held between roughly 100 ms and 500 ms
{ /* perform the action */ }
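For context, the surrounding loop presumably looked something like this (a reconstruction, not the actual code; the pin number and the active-LOW wiring are assumptions):

const int buttonPin = 2;   // assumed: button between pin 2 and GND
byte Rteller = 0;          // counts 10 ms ticks while the button is held

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  Rteller = 0;
  while (digitalRead(buttonPin) == LOW) {   // button held down (active LOW)
    Rteller++;
    Serial.println(Rteller * 10);           // rough elapsed time in ms
    delay(10);
  }
  if (Rteller >= 10 && Rteller <= 50) {     // held roughly 100 to 500 ms
    // perform the action
  }
}

Note that a byte counter wraps at 255, i.e. after about 2.5 seconds of holding.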
Because the actual value of the byte was sent to the Serial Monitor in every 10 ms loop, it became really slow.
I expected a whole list of increasing byte values, but this went much slower than expected.
Yeah, delays are kind of evil when there are alternatives.
Re writing to the serial port: consider that at 9600 baud, a byte plus its start and stop bits (so 10 bits) takes just over 1 ms to clock out. The actual Serial.write() function that puts the byte in the transmit buffer takes microseconds.
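To put numbers on it: printing a three-digit value with Serial.println() sends five bytes ("255" plus CR and LF), which is about 5.2 ms of line time. At 9600 baud only about 9.6 bytes can clock out per 10 ms, so if each loop queues more than that (say a label plus the value), the transmit buffer (typically 64 bytes on AVR boards) fills up and every print starts blocking, which would account for the slowdown described above.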
If you want more accurate timing, get rid of delay and use micros() for timing.
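Something along these lines would do it (a sketch under the same wiring assumptions as above, button on pin 2 to GND; real code would also debounce the switch). Instead of counting delay ticks, it timestamps the press and release edges and computes the duration directly:

const int buttonPin = 2;          // assumed: button between pin 2 and GND
unsigned long pressStart = 0;     // micros() timestamp when the press began
bool wasPressed = false;

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  bool pressed = (digitalRead(buttonPin) == LOW);
  if (pressed && !wasPressed) {
    pressStart = micros();                        // press just started
  } else if (!pressed && wasPressed) {
    unsigned long held = micros() - pressStart;   // press just ended
    Serial.println(held / 1000);                  // duration in ms
    if (held >= 100000UL && held <= 500000UL) {   // roughly 100 to 500 ms
      // perform the action
    }
  }
  wasPressed = pressed;
}

Because nothing blocks, the loop runs thousands of times per second, and the print only happens after the press has already been measured, so it can't distort the timing.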
OP seems to be focusing on the (probably very small) inaccuracy introduced by sending Serial data, while at the same time assuming delay() will be very accurate. I'd bet that is a poor assumption, as I suspect delay() most likely has a tolerance of approximately +/- 1 ms, which means a call to delay(10) may actually delay anywhere from just over 9 to just under 11 ms. That error alone would totally overwhelm any error due to outputting Serial data...
I'm sure someone here can confirm this with actual data...
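For what it's worth, a sketch like this could produce that data (untested, just a sketch): it times each delay(10) call with micros() and prints the result, so the spread of the printed values shows the actual tolerance.

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned long before = micros();
  delay(10);
  unsigned long elapsed = micros() - before;
  Serial.println(elapsed);   // ideally 10000; the spread shows the tolerance
  delay(500);                // pace the output so printing doesn't interfere
}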