Delay of 1ms necessary between Serial.read and Serial.write?

We have a Razor IMU (10736) sensor board connected to a RS-485 Breakout board, talking RS485 to the computer via a USB-RS485 Serial Interface. We use the Arduino IDE to program Razor's on-board ATmega328 (code is based on the Razor AHRS open source firmware).

Everything works fine, except for one detail: the Arduino-programmed Razor seems to need a small delay between reading in and writing out a serial message. The way it works is: the computer polls the Razor, the Razor reads the request and then responds by reporting its data. The code looks like this (simplified):

if(Serial.available() > 0) {

  int incoming = (int) Serial.read();    // read data
  delay(1);                              // ??? why necessary ???
  digitalWrite(RTS_PIN, HIGH);           // RE high, sets RS485-board to transmit
  Serial.write(incoming);                // send data
  delay(1);
  digitalWrite(RTS_PIN, LOW);            // RE low, sets RS485-board to receive

}

There needs to be a delay of at least 1 millisecond between executing Serial.read() and any form of output (digitalWrite() and/or Serial.write()). We tried lowering the value into the microseconds range, but always got corrupted data packets as a result. The problem does not occur when the Razor is only polled once in a while, only when it is polled repeatedly at short intervals (the computer waits for a response and then immediately triggers a new poll). This is all happening at a baud rate of 76800 right now.

Next we ran tests by programming the Razor's on-board ATmega328 directly in C, and this eliminated the need for the delay! Reading and writing serial data could happen right after each other. But since we'd like to stay within the Arduino language (for several reasons), we'd like to figure out how we could solve this problem in other ways.

Is there any reason in the Arduino core's Serial library why there would have to be a delay of 1 ms between serial input and serial output? And if so, are there ways of changing that?

I am just taking a guess, but I think you are waiting on this part of the code to make the board work. I suspect a hardware delay that needs time to take effect: you set the pin HIGH, but the hardware needs a certain amount of time before it can accept serial comms.

digitalWrite(RTS_PIN, HIGH);           // RE high, sets RS485-board to transmit

But if it were a hardware issue, we would have the same problem with the C code as well.

evsc:
But if it were a hardware issue, we would have the same problem with the C code as well.

I am not sure what you are saying. What I can say is that because you do this:

digitalWrite(RTS_PIN, LOW);  // RE low, sets RS485-board to receive

you then have more code to go through before you receive again, so you essentially have a built-in delay before receiving.

You might not actually need a full millisecond delay to transmit. Have you tested delayMicroseconds(), starting with 500?

Yes, I tried microsecond delays with different values, including 500 us.
But I had to go back up to 1000 us to get it to respond properly.

digitalWrite(RTS_PIN, HIGH); // RE high, sets RS485-board to transmit
Serial.write(incoming); // send data
delay(1);
digitalWrite(RTS_PIN, LOW);

It looks like a perfect hardware handshake to me.

What baud rate are you using? Let me guess: 9600 baud. (Hint: at 9600 baud it takes about 1 ms to send 1 byte.)

So you might try a higher baud rate?
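
To put numbers on that hint: an 8N1 byte is 10 bits on the wire (start + 8 data + stop), so the per-byte time is roughly 10 / baud. A little helper, just for illustration (the function name is made up):

unsigned long byteTimeMicros(unsigned long baud) {
  return (10UL * 1000000UL) / baud;   // 10 bits per 8N1 frame: start + 8 data + stop
}
// byteTimeMicros(9600)   -> ~1041 us, i.e. about 1 ms per byte
// byteTimeMicros(115200) -> ~86 us per byte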

I am at a baud rate of 76800.

In principle, if a delay is necessary, I can live with that.
BUT it is not necessary with plain C code.
What does Arduino add to the mix that makes this delay necessary?

Buffered writes. Serial.write() puts the byte in an outbound queue and returns immediately; digitalWrite(RTS_PIN, LOW) then shuts off the transmitter before the byte has actually been sent.

Test your application with Arduino 0023; it has blocking writes. If your application works then the new buffered writes in Arduino 1.0 are interfering.
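
If you want to stay on Arduino 1.0, the usual pattern is to drain the outbound buffer before dropping the direction pin. A minimal sketch, assuming RTS_PIN drives the RS-485 driver-enable as in your code (but see the next reply for a caveat):

digitalWrite(RTS_PIN, HIGH);   // enable the RS485 transmitter
Serial.write(incoming);        // queue the byte (returns immediately with buffered writes)
Serial.flush();                // wait for the outbound buffer to drain
digitalWrite(RTS_PIN, LOW);    // back to receive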

Arduino 1.0 has a Serial.flush() to ensure all the characters you sent have actually been "sent". However, after looking at the code in the 1.0 HardwareSerial.cpp, it appears that it doesn't really wait for the last character to be sent out; it merely waits for the s/w TX buffer to be empty. This is not the same as all the characters having been transmitted. (The last character might still be in the UART.)
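
For reference, here is a minimal sketch of what waiting for "really sent" can look like on the ATmega328, using direct register access (UCSR0A is the USART0 status register and TXC0 its transmit-complete flag; treat this as an illustration, not something tested against your setup):

UCSR0A |= (1 << TXC0);            // clear any stale transmit-complete flag
digitalWrite(RTS_PIN, HIGH);      // enable the RS485 driver
Serial.write(incoming);           // queue the byte
Serial.flush();                   // wait for the s/w TX buffer to empty
while (!(UCSR0A & (1 << TXC0)))   // then wait until the last byte has left the UART shift register
  ;
digitalWrite(RTS_PIN, LOW);       // now it is safe to switch back to receive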

--- bill

Buffering (in or out of the UART) explains the need for a delay(1) after the transmit. IMO, no one has explained the delay(1) after the Serial.read()!

The transmit delay is not unique to Arduino. Most UART-like devices on most microcontrollers will need something similar.

The code looks like this (simplified)

That looks good to me, maybe it's in the stuff you removed.

Are you only expecting one character from the Razor? If you use Serial.println() on the Razor, that would be a problem (println appends a carriage return and line feed, so more than one character comes back).

Can you show us the Razor code?


Rob

Thanks all for the suggestions!

I tested the same setup with Arduino 0022 and now it works without the delay(1) between Serial.read and Serial.write!!

if(Serial.available() > 0) {

      int incoming = (int) Serial.read();    // read data
      digitalWrite(RTS_PIN, HIGH);           // RE high, sets RS485-board to transmit
      Serial.write(incoming);                // send data
      delay(1);
      digitalWrite(RTS_PIN, LOW);            // RE low, sets RS485-board to receive

}

Amazing!
Thanks!