So currently it appears the input buffer is successfully stopping that message loss (at least with the sfStream test), but when I make some changes and try to use it in practice I get the same failure. I am doing a few things to determine whether the overflow is actually occurring at run time: setting up monitoring on Serial1, and making it so the host can check the overflow state on request (a rough sketch of what I mean is below). I will let you know what I find, but I was curious whether there is a technical purpose for the 5 second delay at the beginning?
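This is roughly what I have in mind for the host check. It is a minimal sketch only; rxOverflowed and the '?' request byte are placeholders I made up, not anything from the library:

bool rxOverflowed = false; // placeholder flag, set wherever the overflow is actually detected

void setup() {
  Serial.begin(115200);  // link to the host
  Serial1.begin(115200); // separate debug monitor
}

void loop() {
  // ... normal message handling would set rxOverflowed when an overflow occurs ...
  if (Serial.available() && (Serial.read() == '?')) { // '?' is an arbitrary request byte
    Serial.println(rxOverflowed ? F("OVERFLOW") : F("OK")); // answer the host
    Serial1.println(F("host queried overflow status"));     // mirror to the debug port
  }
}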
Reducing the buffer to create and detect an overflow also makes sense; I will get a Python-only script made to reproduce the bug. I do not quite understand the purpose of tracking all the sent messages, though, since I have no problem seeing which one fails on that side. Is there something the tracking catches that a simple sequence check (sketched below) would miss?
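For reference, this is how I currently spot the failed message. It is a minimal sketch, assuming each test message carries a decimal sequence number; the message format and checkSeq are my own, not from the library:

long expectedSeq = 0; // next sequence number we expect to receive

// call with the sequence number parsed from each received message;
// any gap means at least one message was lost in between
void checkSeq(long seq) {
  if (seq != expectedSeq) {
    Serial1.print(F("lost message(s): expected "));
    Serial1.print(expectedSeq);
    Serial1.print(F(", got "));
    Serial1.println(seq);
  }
  expectedSeq = seq + 1;
}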
I may also have made some mistakes when switching it to take its input from Serial instead of the stream. Is this correct?
safeReader.connect(bufferedInput);
Also, does this check for overflow of the input buffer, or of the stream it is set to?
sfStream.RxBufferOverflow()
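For context, here is roughly how I have it wired after the switch. I am going from memory on the macro and method names (createBufferedInput, nextByteIn(), etc.), so please treat this as a sketch of my setup rather than known-good code and correct anything that is wrong:

#include "BufferedInput.h"
#include "SafeStringReader.h"

createBufferedInput(bufferedInput, 128);      // extra input buffering on top of Serial
createSafeStringReader(safeReader, 64, '\n'); // reads newline-delimited messages

void setup() {
  Serial.begin(115200);
  bufferedInput.connect(Serial);     // the buffer drains Serial...
  safeReader.connect(bufferedInput); // ...and the reader drains the buffer
}

void loop() {
  bufferedInput.nextByteIn(); // move any waiting Serial bytes into the buffer
  if (safeReader.read()) {    // true once a complete delimited message is ready
    // handle the message in safeReader here
  }
}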