This code works fine: the Arduino echoes back every character I send up to '\n', regardless of the length of the string.
However, if I uncomment delay(200), it starts to truncate the last characters when I send messages of around 70 characters.
It seems the Arduino can't deal with a long message (around 70 characters) if the message arrives while it is in the middle of a delay().
I tried it and it is weird!
I put a Serial.println after the string and it really does exit before the string is complete. It may be a bug tied to some interrupt set up by the compiler for delay(), but I am not an expert, just guessing.
Why don't you read the characters one by one from the serial port?
This code works; you can select the maximum length of your message, and if it is too long, the rest will be processed the next time the function runs (I did not write this function; I found it somewhere and adapted it to my use).
void setup() {
  Serial.begin(57600);
}

const int maxcharacters = 50;

void loop() {
  char txtMsg[maxcharacters + 1];   // +1 leaves room for the terminating '\0'
  readSerial(txtMsg, maxcharacters);
  Serial.println(txtMsg);
  delay(200);
}
int readSerial(char result[], int maxchars)
{
  int i = 0;
  while (1)
  {
    while (Serial.available() > 0)
    {
      char inChar = Serial.read();
      if (inChar == '\n' || i == maxchars)
      {
        // The caller's buffer must hold maxchars + 1 bytes for this '\0'.
        result[i] = '\0';
        // Note: since Arduino 1.0, Serial.flush() waits for *outgoing*
        // data to finish sending; it does not clear the receive buffer,
        // so calling it here is not needed.
        return 0;
      }
      if (inChar != '\r')   // ignore carriage returns
      {
        result[i] = inChar;
        i++;
      }
    }
  }
}
delay(...) is bad because your Arduino can do little else until the delay(...) is over. Avoid it!
One thing that an Arduino can handle during delay(...) is interrupts. Incoming characters on the serial port are handled by interrupts.
HOWEVER...
The incoming characters are handled by being placed in a buffer. On a limited-memory processor like the Arduino, the buffer is typically limited to about 64 characters. Anything beyond the buffer size once it is filled during a delay(...) is simply thrown away. It is the job of your program to pull characters out of this buffer as fast as possible. It cannot do that if it is executing a blocking operation like delay(...).
Interrupt service routines can be nasty to debug. I do NOT recommend them for beginners.
One way to handle this issue is to modify the library and increase the buffer size. Yuck. A better way is to avoid delay(...) or anything else that blocks execution for a period of time. Look for the "Blink Without Delay" example and/or learn about Finite State Machines. These are MUCH BETTER than delay(...).
There has been a discussion about the Serial parseInt method, which has a peculiar "feature": it waits for a programmable timeout before it completes.
The Serial documentation does not say anything about such a feature in the readStringUntil method, but I would suggest digging into the source code to make sure such a "wait for completion" timeout is not hidden somewhere.
But as odd as it appears, it sure looks as if the Serial buffer is not being emptied after each character is processed (inside the readStringUntil method), hence the buffer will be overrun while readStringUntil is running. Again, the answer should be in the source code of the method.
After all, the incoming data is being added to the Serial buffer (by the receive interrupt) regardless of, and independently of, how the buffer is being emptied by any Serial method, or whether it is being emptied at all.
My recommended hack would be to increase the size of the Serial buffer to accommodate the known length of the incoming data.
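For reference, on the AVR cores the receive buffer size is controlled by the SERIAL_RX_BUFFER_SIZE macro in HardwareSerial.h, and some toolchains let you override it with a compiler define instead of editing the library. The exact mechanism depends on your build system; the PlatformIO line below is one example:

```ini
; platformio.ini (PlatformIO builds only; the stock Arduino IDE
; requires editing HardwareSerial.h instead)
build_flags = -D SERIAL_RX_BUFFER_SIZE=256
```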
But the preferred / safer way would be to use the read() method, store the retrieved data in a temporary array as it arrives, and process it once the serial buffer is known to be empty.
I am also against using if (Serial.available()) - or similar constructs - in the loop() function.
Usually the program waits for serial data, and waiting in loop(), which cannot be controlled by the user code without jumping through the hoops of millis()-based code, makes little sense.
If you are going to wait for data anyway, use while() so you have control over the wait process.