
Topic: Serial Read and Delay (Read 3 times)


That delay(1) is in the code to compensate for the time it takes for each character to be received by the hardware USART.  Keep in mind that the serial data is, more than likely, arriving more slowly than the microcontroller can execute instructions.

Picture this: if the microcontroller and the USART were in a foot race, the microcontroller would win every time.

This means that you have to introduce delays to let the USART catch up to the microcontroller.

I assume that you're communicating at some speed faster than 9600 bps, because that code would not work at 9600 bps or lower: at 9600 bps, each character takes just over 1 millisecond to receive.

I would recommend using what I posted above, as that is a logical way of receiving the data (although it could be improved by adding time-outs and other fail-safes).



Nov 16, 2009, 08:28 pm Last Edit: Nov 16, 2009, 08:29 pm by Mart Reason: 1
Well, the protocol I am using is not public and I don't have access to any documentation.  I know it runs at 4800 bps, and everything runs fine with the delay(1)...


At 4800 bps, each byte will take over 2 milliseconds to be transmitted.  And any way you slice it, it will take ~187 milliseconds to receive 90 bytes at 4800 bps.

Here is a single byte @ 4800 bps (10 bits on the wire: 1 start + 8 data + 1 stop, ~2.08 ms total):

Code: [Select]
void setup()
{
  Serial.begin(4800);
  Serial.write(0x55);  // one byte occupies the wire for ~2.08 ms at 4800 bps
}

void loop()
{
}
The only way the snippet of code you posted would work is if the data had already been transmitted and stored in the HardwareSerial buffer BEFORE that code executed.  That follows from the code itself: the loop begins with while(Serial.available()), so at least one byte must already be waiting before the loop body can run at all.

In any case, this is just speculation; I can't help much more without knowing the circumstances of the communication, such as when the data is transmitted relative to the start of your posted code.



Nov 16, 2009, 09:36 pm Last Edit: Nov 16, 2009, 09:39 pm by Mart Reason: 1
In the code, I am writing the request to the serial port and reading the response right after, with no other instructions in between.


I just computed around ~200 ms for 90 bytes as well.  The request length is variable depending on what I am looking for.  I am just trying to make sure the serial communication is as optimized as possible, and I couldn't figure out that delay(1); ...



Without checking Serial.available(), you will read rubbish out of the input buffer and then transmit it.  That code will therefore constantly transmit junk, plus the occasional genuinely received character.  Is that what you want to do?
