Serial Read and Delay

Hi,

I am sending 30 bytes with Serial.print() and receiving 90 bytes back (30 bytes + a 60-byte response).

I use the following to retrieve data:

while (Serial.available())
{
  delay(1);
  b = Serial.read();
  if (b != -1)
  {
    data[k++] = b;
  }
}

and it works perfectly. If I remove the delay(1), available() returns false and the function returns early.

This is adding a 90 ms delay to an already way-too-long function :frowning:

Can anyone explain to me why that delay(1) is needed, or should I adopt another strategy for writing/reading?

I can't explain the need for the delay, but Serial.available() does not return a boolean. It returns an int that says how many bytes are available to read. If you made use of that value, you would not need to call Serial.available() as many times. Function calls are expensive; don't make ones you don't need to.
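For instance, a minimal sketch along those lines (readChunk is a hypothetical helper; data and k just mirror the names in the original snippet) reads the count once and drains exactly that many bytes:

char data[90];
int k = 0;

void readChunk()
{
  int n = Serial.available();    // ask once how many bytes are currently buffered
  while (n-- > 0 && k < 90)
  {
    data[k++] = Serial.read();   // a byte is known to be waiting, so no -1 check needed
  }
}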

does not return a boolean. It returns an int that says how many bytes are available to read

In C, zero (no bytes available) is equivalent to FALSE and non-zero (1 or more bytes available) evaluates to TRUE, so there is nothing wrong with the condition as written; it just might confuse a newbie.
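Put differently, these two loop conditions behave identically:

while (Serial.available()) { /* ... */ }      // non-zero byte count counts as true
while (Serial.available() > 0) { /* ... */ }  // the same test written out explicitly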

The test for -1 should catch any bad reads without the need for the delay.
The only thing I can think of is that the RX buffer is only 64 bytes long and the head/tail pointer handling is at fault, but that seems unlikely.

As with all serial communication, you need to have a protocol. It seems in your case, you want to receive 90 bytes. So this section of your protocol is going to require you to receive 90 bytes (no more, no less).

Thus, here is one way you could attack this:

char data[90];
int i = 0;
while (i < 90)
{
  while (!Serial.available()) { /* Do nothing. Just wait until we have a byte */ }
  data[i++] = Serial.read();
}

b

bhagman: you are right. This is my default option if I can't figure out why delay(1) is needed....

That delay(1) is in that code to compensate for the time it takes for each character to be received by the hardware USART. You have to take into consideration that the serial data is, more than likely, arriving far more slowly than the microcontroller can execute instructions.

Picture this: if the microcontroller and the USART were in a foot race, the microcontroller would win every time.

This means that you have to introduce delays to let the USART catch up to the microcontroller.

I assume that you're communicating at some speed faster than 9600 bps, because that code would not work if you were at 9600 bps or lower. This is because it takes over 1 millisecond to receive each character at 9600 bps.
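(With the usual 8N1 framing, a byte on the wire is 10 bits: one start bit, 8 data bits, one stop bit. At 9600 bps that is 10 / 9600 ≈ 1.04 ms per character, so a 1 ms delay is not quite enough to guarantee the next character has arrived.)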

I would recommend using what I posted above, as that is a logical way of receiving the data (although it could be improved by adding time-outs and other fail-safes).
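As a rough sketch of what such a time-out could look like (the 1000 ms limit is just an arbitrary value for illustration, not part of the original suggestion):

char data[90];
int i = 0;
unsigned long start = millis();

while (i < 90 && (millis() - start) < 1000)   // give up after about one second
{
  if (Serial.available())
  {
    data[i++] = Serial.read();
  }
}
// if i is still below 90 here, the transfer timed out and data[] is incomplete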

b

Well, the protocol I am using is not public and I don't have access to any documentation. I know it runs at 4800 and everything runs fine with the delay(1)....

At 4800 bps, each byte will take over 2 milliseconds to be transmitted. And any way you slice it, it will take ~187 milliseconds to receive 90 bytes at 4800 bps.
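Working it out: 10 bits per byte / 4800 bps ≈ 2.08 ms per byte, and 90 bytes × 2.08 ms ≈ 187.5 ms just to clock the bits in, before any processing time.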

Here is a sketch that puts a single byte on the wire @ 4800 bps:

void setup()
{
  Serial.begin(4800);

  Serial.print('\0');   // send 0x00
  Serial.print('U');    // send 'U' = 0x55, an alternating 1/0 bit pattern
  Serial.print('\0');   // send 0x00
}

void loop()
{
}

The only way the snippet of code you posted would work is if the data has been transmitted and stored in the HardwareSerial buffer BEFORE that code is executed. This is obvious anyway, because the loop begins with while(Serial.available()) which implies that there must be a byte available before the loop can even start.

In any case, this is just a bunch of speculation as I can't help much more unless I know what the circumstances are regarding the communications, such as when the data has been transmitted with respect to the beginning of your posted code.

b

In the code, I am writing the request to the serial port and reading it right after, with no other instruction in between.

while (1)
{
  SerialWrite();
  SerialRead();
  DisplayData();
}

I just computed around ~200 ms for 90 bytes as well. The request length is variable depending on what I am looking for. I am just trying to make sure I am optimized regarding the serial communication, and I couldn't figure out that delay(1); ...

thanx

Without checking Serial.available() you will read rubbish out of the input buffer and then you will transmit it. Therefore this code will constantly transmit junk and the occasional received character. Is that what you want to do?

you will read rubbish out of the input buffer

It won't be rubbish, it'll be a constant -1, which should make it pretty easy to spot.
But yes, use of Serial.available() is recommended.
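For completeness, a small sketch of the pattern being described, checking available() and also guarding the int returned by read():

if (Serial.available() > 0)
{
  int c = Serial.read();   // read() returns an int: -1 means nothing was waiting
  if (c >= 0)
  {
    // c now holds a valid received byte (0..255)
  }
}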