I'm setting up a RealBasic application to communicate with an Arduino (Uno, via USB). In the application, there are three buttons: On, Off, Flash.
Pressing one of these buttons sends a string to the Arduino, which is waiting for incoming serial data. This all works as expected. When you press a button, you see the RX light on the Arduino flash immediately. However, it takes almost a second before the LED lights up. Same with turning it off, or flashing the light - instant communication with the Arduino, but nearly 1 second before the LED reacts.
I set up a test to calculate the times it takes to execute the various steps. All of the steps are in the 2-4 microsecond range, except for Serial.readString(). But even then, it's only taking about 1/10 of a second so that doesn't account for the full second delay.
I can't test this in my actual application at the moment, so I guess my question is: should I expect nearly 1 second reaction times on digitalWrite after issuing a serial command, or is what I'm seeing something specific to Pin 13's LED?
Well, I thought it was probably Serial.readString as well, but when I tested the time on that, it was only like 17000 microseconds -- much longer than all the other steps, but still a lot shorter than the full second or so that I'm seeing.
Here's my code - pretty basic. And like I said, the RX light indicates an incoming serial signal as soon as you click on one of the buttons in the application, so I don't think it's latency in the serial connection.
but when I tested the time on that, it was only like 17000 microseconds
Not sure how you got that figure - it should be over 1 million microseconds. Unless there was an overflow problem with your variables - did you declare the timing variables that hold values from micros() as unsigned long?
Anyway, looking at your code, if the command from the PC is always going to be one character, there is no need to use Strings (capital S), and they are widely frowned upon on the Arduino anyway.
Instead, declare serialState as char. Change your if statement to:
if (Serial.available() > 0) {
  serialState = Serial.read();
}
Then change your comparisons like this:
if (serialState == '1') { // single quotes around a character constant
I was sending a string because that appears to be the only type that I can send in RealBasic. From that end, I don't really care if I'm sending a number, a string or whatever, but the only way I could get the Arduino to respond correctly was with Serial.readString().
So the way it is now, per Hackscribble's suggestion, I'm still sending a numerical code as a string from the application, but Arduino is treating it as a char, using Serial.read(). It's all working basically instantaneously now.
The person who decided to add the String class to the Arduino should be ashamed of himself - I recommend you stop using it.
The readString() method you're using is implemented by the Stream class.
readString()
Description
readString() reads characters from a stream into a String. The function terminates if it times out (see setTimeout()).
setTimeout()
Description
setTimeout() sets the maximum milliseconds to wait for stream data; it defaults to 1000 milliseconds.
As you can see, the method will not return until a second after receiving the last character, by default.
Since you only want to receive one character, read it using Serial.read() and hold it in a char variable. It's much simpler, avoids various problems associated with the String class, and gets rid of the unwanted delay.
Using the serial monitor, the below code turns the Arduino pin 13 LED on/off pretty quickly after the appropriate string is sent.
// zoomkat 8-6-10 serial I/O string test
// type a string in serial monitor. then send or enter
// for IDE 0019 and later
//A very simple example of sending a string of characters
//from the serial monitor, capturing the individual
//characters into a String, then evaluating the contents
//of the String to possibly perform an action (on/off board LED).
int ledPin = 13;
String readString;

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
  Serial.println("serial on/off test 0021"); // so I can keep track
}

void loop() {
  while (Serial.available()) {
    delay(3);
    char c = Serial.read();
    readString += c;
  }
  if (readString.length() > 0) {
    Serial.println(readString);
    if (readString.indexOf("on") >= 0) {
      digitalWrite(ledPin, HIGH);
    }
    if (readString.indexOf("off") >= 0) {
      digitalWrite(ledPin, LOW);
    }
    readString = "";
  }
}
You keep offering up this peculiar method for handling serial input, and people keep telling you why it is a poor approach. Your code assumes that characters arrive at the expected rate and that there is a big enough gap between adjacent messages to trigger your inter-character timeout. The inter-character timing relies on magic delays in your code which have to be hand-tweaked to match the serial line speed. It uses the problematic String class, for no particular benefit. This is simply not a good general way to handle serial input.
It's already been explained why the delays were occurring in the original code. The goals of the original project can be met very simply by reading one character at a time using Serial.read() and then processing the received character. The changes required are minimal - it just means using characters instead of Strings. This problem doesn't require messages to be buffered at all, let alone using such a peculiar technique.
You keep offering up this peculiar method for handling serial input, and people keep telling you why it is a poor approach.
Come on now, you are just upset because the little piece of serial code I posted demonstrates the String class you were bashing probably isn't the reason for the delay. 8)
Thanks for the responses. I've been messing with this since last night, and it's lightning fast treating the incoming string as a char, plus the code change was minimal, so that's how I think we'll do this.