We have a Razor IMU (10736) sensor board connected to an RS-485 breakout board, talking RS-485 to the computer via a USB-RS-485 serial interface. We use the Arduino IDE to program the Razor's on-board ATmega328 (the code is based on the Razor AHRS open-source firmware).
Everything works fine except for one detail: the Arduino-programmed Razor seems to need a small delay between reading in and writing out a serial message. It works like this: the computer polls the Razor, and the Razor reads the request and responds by reporting its data. The code looks like this (simplified):
if (Serial.available() > 0) {
    int incoming = Serial.read();    // read the polling request
    delay(1);                        // ??? why necessary ???
    digitalWrite(RTS_PIN, HIGH);     // RE high, sets RS-485 board to transmit
    Serial.write(incoming);          // send data
    delay(1);
    digitalWrite(RTS_PIN, LOW);      // RE low, sets RS-485 board to receive
}
A delay of at least 1 millisecond is needed between Serial.read() and any form of output (digitalWrite() and/or Serial.write()). We tried lowering the value into the microsecond range, but always got corrupted data packets as a result. The problem does not occur when the Razor is polled only once in a while, but it does occur when it is polled repeatedly at short intervals (the computer waits for a response and then immediately triggers the next poll). All of this is currently happening at a baud rate of 76800.
Next we programmed the Razor's on-board ATmega328 directly in C, which eliminated the need for the delay: reading and writing serial data could happen right after each other. But since we would like to stay within the Arduino language (for several reasons), we would like to figure out how to solve this problem in some other way.
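For reference, a register-level blocking transmit on the ATmega328 looks roughly like the sketch below (this is not our actual firmware, just an illustration of the blocking behaviour): the write waits on the UART hardware itself, so there is no software buffer that could still be draining afterwards.

#include <avr/io.h>

void uart_write(uint8_t c) {
    while (!(UCSR0A & (1 << UDRE0)))
        ;             // wait until the USART data register is free
    UDR0 = c;         // hand the byte to the hardware; it starts shifting out immediately
}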
Is there any reason why the Arduino core's Serial library would require a 1 ms delay between serial input and serial output? And if so, is there a way to change that?
I am just taking a guess, but I think you are waiting on this part of the code to make the shield work. I suspect a hardware delay that needs time to take effect: you set the pin HIGH, but the hardware needs a certain amount of time before it can accept serial communication.
digitalWrite(RTS_PIN, HIGH); // RE high, sets RS485-board to transmit
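If that is the case, a much shorter, explicit settle time right after raising the pin might already be enough; the 10 µs below is just a placeholder, and the real figure (if any) would have to come from the transceiver's datasheet:

digitalWrite(RTS_PIN, HIGH);   // enable the RS-485 driver
delayMicroseconds(10);         // hypothetical enable-to-ready settle time; check the transceiver datasheet
Serial.write(incoming);        // send data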
In principle, if a delay is necessary, I can live with that.
BUT it is not necessary with C code.
What does Arduino add to the whole mix that makes this delay necessary?
Buffered writes. Serial.write() puts the byte in an outbound queue and returns immediately; digitalWrite(RTS_PIN, LOW); then shuts off the transmitter before the byte has actually been sent.
Test your application with Arduino 0023; it has blocking writes. If your application works there, then the new buffered writes in Arduino 1.0 are interfering.
1.0 has a Serial.flush() to ensure that all the characters you sent have actually been transmitted.
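In other words, something along these lines (untested) should let you drop the delay(1) after the write:

digitalWrite(RTS_PIN, HIGH);   // enable the RS-485 driver
Serial.write(incoming);        // queues the byte and returns immediately in 1.0
Serial.flush();                // block until the outgoing data has been handed to the UART
digitalWrite(RTS_PIN, LOW);    // back to receive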
However, after looking at the code in the 1.0 HardwareSerial.cpp, it appears that flush() doesn't really wait for the last character to be sent out; it merely waits for the software TX buffer to be empty. That is not the same as all characters having been transmitted: the last character might still be in the UART.
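If that matters for the RS-485 direction switch, one workaround (a sketch that assumes the ATmega328's USART0 registers; it is not something the 1.0 core does for you) is to also poll the hardware transmit-complete flag before dropping the enable pin:

UCSR0A |= (1 << TXC0);              // clear the transmit-complete flag before sending
Serial.write(incoming);
Serial.flush();                     // waits for the software TX buffer only (1.0)
while (!(UCSR0A & (1 << TXC0)))
    ;                               // wait until the last byte has left the UART shift register
digitalWrite(RTS_PIN, LOW);         // now it is safe to disable the RS-485 driver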
Buffering (in or out of the UART) explains the need for a delay(1) after the transmit. IMO, no one has explained the delay(1) after the Serial.read()!
The transmit delay is not unique to Arduino; most UART-like peripherals on most microcontrollers need something similar.