Arduino and libserial - strange behaviour

Hi. I’m trying to read data from an Arduino in a completely non-blocking manner in C++ using libserial (non-blocking is essential because this will be used inside a real-time video-processing program).

To test and learn, I’ve done the following-
Arduino puts a ‘?’ character on the serial line every 2.5 seconds. The C++ program checks the serial port in a while(1) loop and tells me via cout when it gets a ‘?’. Otherwise it just couts ‘x’. Sounds simple enough.

Here’s my C++ code-

#include <SerialStream.h>
#include <string>
#include <iostream>
#include <fstream>

using namespace std;
using namespace LibSerial;

int main()
{
    SerialStream my_serial_stream;

    my_serial_stream.Open("/dev/ttyUSB0");
    my_serial_stream.SetBaudRate(SerialStreamBuf::BAUD_9600);

    my_serial_stream.SetVTime(1);
    my_serial_stream.SetVMin(0);

    while (true)
    {
        char c = 'x';
        my_serial_stream >> c;
        std::cout << c << std::endl;
        if (c == '?') std::cout << "hahaha got it!" << std::endl;
    }

    my_serial_stream.Close();
    return 0;
}

And the very simple Arduino code-

void setup()
{
    Serial.begin(9600);
    delay(500);
}

void loop()
{
    Serial.write('?');
    delay(2500);
}

My problem is that with vtime = 1 it never shows me that it got a ‘?’. The Arduino and the serial line are working fine, because as soon as I make vtime = 25 or more, the '?'s start getting detected. But of course that is useless for me, because it blocks my program for 2.5 seconds.

What I need is for the code to move on if there’s nothing on the serial port, and tell me if there’s a ‘?’.

Any help? Thanks!

Update-
As soon as I make vtime less than 25, the '?'s stop being detected. This sort of makes sense, since vtime = 25 means the program waits up to 2.5 s per loop, vtime = 24 → 2.4 s, and so on.

But, in my limited experience, this sort of behaviour suggests that incoming bytes are discarded if nothing is waiting to read them at that exact moment, which is very weird.

This was posted a long time ago, but I had a similar issue and I'm not sure if the original poster solved this, so maybe this helps someone else. The problem for me was the output (std::cout) to the terminal. If you leave the vtime settings at their defaults and simply initialise the way you have, then all you need to do is change your std::endl; to std::flush;

std::flush pushes out whatever data is sitting in the output buffer, whereas endl adds a newline/carriage return character. Because the serial data is effectively a stream of characters, std::endl; on its own did not seem to be enough to make std::cout put it on screen. To write it immediately, we need to std::flush.

(For the record I'm still also learning so if anyone wants to correct my own interpretation of meaning then go right ahead.)


The only thing wrong in what you say is that flush does NOT add the carriage return/line feed that endl does. To get the carriage return/line feed and force the output to happen now, you need both endl and flush.

Awesome, thanks for making that clear. Just wondering why mine worked without endl: if the output already has a newline/carriage return, does this in combination with the flush produce the output as you mentioned? So my question is: if the serial output already contains a newline/carriage return, is endl; still necessary?


No. The purpose of endl is to supply the carriage return/line feed (in the proper order and quantity for the operating system it is running on).