Weird Serial Port Behavior

Hello. I am trying to set up serial communication between my Arduino Uno and a .NET program through the Uno's USB serial port. For some reason, whenever I open the port from .NET, a few random bytes (0xF0 or 0xFF) appear in the Uno's receive buffer. I've tried it with all baud rates using the following port settings in .NET:

Handshake: None
Parity: None
Stop Bits: 1
Data Bits: 8
RTS Enabled: tried both true and false
DTR Enabled: tried both true and false

I set up a very simple sketch to send back whatever bytes are received on the Uno:

void setup()
{
  Serial.begin(14400);
}

void loop()
{
  int N = Serial.available();
  if(N)
  {
    byte* buffer = new byte[N];
    Serial.readBytes((char*)buffer, N);
    Serial.write(buffer, N);
    delete buffer;
  }
}

Once I open the port in .NET, the bytes appear in the Uno receive buffer and are sent back to the .NET program. Here is the .NET code:

        Dim mSerial As New System.IO.Ports.SerialPort("COM4", 14400)

        ' DTR and RTS disabled so the Uno does not auto-reset when the port opens
        mSerial.DtrEnable = False
        mSerial.RtsEnable = False
        mSerial.Handshake = IO.Ports.Handshake.None
        mSerial.Parity = IO.Ports.Parity.None
        mSerial.StopBits = IO.Ports.StopBits.One

        mSerial.Open()

        ' give the Uno time to echo anything back before reading
        Threading.Thread.Sleep(50)

        Dim rec(128) As Byte
        Dim NRead As Integer = mSerial.Read(rec, 0, mSerial.BytesToRead)

When I execute this code, I get random responses depending on what the baud rate is. For example, 14400 gives 1-4 bytes of 0xFF, while 115200 gives 3 bytes of 0xF0.

Any ideas what's going on here?

Your Arduino sketch looks a bit unconventional, but I can't see anything actually wrong with it. The dynamic memory allocation you are doing there is unnecessary, and possibly hazardous on the Arduino, and I would really not do it that way myself. I'd try just echoing one character at a time.
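
Something like this, for example (just an untested sketch, same baud rate as yours):

void setup()
{
  Serial.begin(14400);
}

void loop()
{
  if (Serial.available())
  {
    Serial.write(Serial.read()); // echo each byte straight back as it arrives
  }
}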

Are you sure the USB device is actually COM4? You can check with Device Manager.

Also, I don't understand how your .NET program is actually trying to send anything to the Uno. Shouldn't you be trying to write out something identifiable, and then see if you get it back? I don't see where that happens.

Thanks for the quick response. I'm positive that it's COM4, and the .NET code specifically doesn't use a serial write command because I wanted to demonstrate the weird behavior of the Uno. The Uno is somehow receiving bytes even though I'm not sending it anything. It only receives these phantom bytes when the COM port is opened on the PC it is connected to.

I've done some more testing, and found that it is dependent on the baud rate. The .NET code causes phantom bytes to appear when opening the port with any baud rate above 9600. The Arduino IDE Serial Monitor only causes this behavior on 14400 as far as I can tell.

On an unrelated note (and I hope it doesn't hijack this thread topic haha), why is allocating the memory like that bad for the Arduino? Also, I just noticed I forgot the [] in the delete haha. EDIT: I just did a little research and realized that allocating/deallocating objects on the heap inside of a loop will quickly cause it to become fragmented.
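
For reference, here's roughly how I'd rewrite the echo with a fixed buffer on the stack instead (untested):

void setup()
{
  Serial.begin(14400);
}

void loop()
{
  byte buffer[64]; // fixed size, no heap use; the Uno's receive buffer is 64 bytes anyway
  int N = Serial.available();
  if (N > 0)
  {
    if (N > 64)
    {
      N = 64; // never read more than the buffer can hold
    }
    Serial.readBytes((char*)buffer, N);
    Serial.write(buffer, N);
  }
}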

pcdangio:
I've done some more testing, and found that it is dependent on the baud rate. The .NET code causes phantom bytes to appear when opening the port with any baud rate above 9600. The Arduino IDE Serial Monitor only causes this behavior on 14400 as far as I can tell.

What version of the IDE are you using? I can't set the Serial Monitor to 14400. It's not in the list.

I'm running 1.5.4

Very simple test code that echoes what is sent from the serial monitor back to the serial monitor. If it works with the serial monitor, then try the .NET code.

// zoomkat 7-30-11 serial I/O string test
// type a string in serial monitor. then send or enter
// for IDE 0019 and later

String readString;

void setup() {
  Serial.begin(9600);
  Serial.println("serial test 0021"); // so I can keep track of what is loaded
}

void loop() {

  while (Serial.available()) {
    delay(2);  //delay to allow byte to arrive in input buffer
    char c = Serial.read();
    readString += c;
  }

  if (readString.length() >0) {
    Serial.println(readString);

    readString="";
  } 
}

I'd recommend not using String, also.
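
A plain char array does the same job, something like this (untested):

char readBuffer[64];
byte count = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  while (Serial.available() && count < sizeof(readBuffer) - 1) {
    delay(2);  // same delay to allow each byte to arrive
    readBuffer[count++] = Serial.read();
  }

  if (count > 0) {
    readBuffer[count] = '\0';  // terminate and echo the collected string
    Serial.println(readBuffer);
    count = 0;
  }
}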

I'd conjecture that when you start up a serial connection, you might get a few characters of random garbage just from whatever static electricity is present in the wires [that's wrong, but I could not be bothered being more verbose right now].

I would not write any application which relied on serial data communications starting cleanly without any random or corrupted characters at the start of it.

I suggest you try actually writing some chars from your .NET application to the Uno, and see if it works. If it doesn't work, then you have other problems to worry about. If it does work, then you can simply worry about a work-around for the problem, assuming you don't want to start trying to solve it by tampering with Windows or your computer motherboard or whatever.

And for Larry, I am pretty sure that 14400 baud rate works on my computer, I use it a lot.

The other thing to remember is that it is actually USB, and not merely a serial connection.

I am not a big expert on exactly how USB works. It would certainly not surprise me, however, if this is intentional behaviour by the USB scheme: when you plug something in, it sends some characters to provoke a response from the device at the other end of the USB cable, so it can determine what device is there. I am sure some other expert would know whether this happens or not.

Sometimes a small delay after Serial.begin(), before reading, makes the "noise" stop happening. Try delay(100);
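
Something along these lines (just a sketch of the idea):

void setup()
{
  Serial.begin(14400);
  delay(100); // let things settle after the port comes up
  while (Serial.available())
  {
    Serial.read(); // throw away any "noise" bytes already in the buffer
  }
}

void loop()
{
}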

And for Larry, I am pretty sure that 14400 baud rate works on my computer, I use it a lot.

Yes, my computer can use 14400, but my Serial Monitor (in the IDE) does not have a 14400 entry in the dropdown. That's why I asked what rev of the IDE he was running.

14400 baud is available here using 1.0.5 under Windows. What rev of the IDE are you running and on what OS ?

Thanks for all of the replies everybody. I'm at work right now but will check on the IDE version when I get home. I have a feeling that the available baud rates in the Serial Monitor drop down menu may depend on what board you have specified in the IDE settings.

To boil things down a little further, the problem is not with the actual communication itself. I can send and receive bytes just fine between the two endpoints. It's just that some random bytes appear in the Arduino serial receive buffer when the port is opened from the other end, even though no data has actually been sent by the code running on the PC. One thing I'm also going to do when I get back is closely watch the RX LED on the Arduino when the port is opened on the PC. Maybe this will provide some insight as to whether the bytes are actually being sent by the PC on port opening, or if the Arduino is just magically placing bytes in the receive buffer. I may also try a serial port emulator on the PC... will keep you guys posted on this.

Now the reason that this is such a big problem for me is because these randomly appearing phantom bytes cause all of my serial messages to become shifted. For example, say I'm sending 16 bytes of data at a time from my .NET code. The receiving Arduino code in the loop() would be:

void loop()
{
    char buffer[16];
    Serial.readBytes(buffer, 16); // blocks until all 16 bytes arrive or the timeout expires
}

The issue becomes that, when the port is opened from the PC side, those phantom bytes appear in the serial buffer and cause all data sent from that point on to be shifted. Say I send 8 bytes of 0xAA and 8 bytes of 0xBB in the 16-byte "packet" from the PC to the Arduino. Well, the phantom bytes (which are usually several 0xFF's or 0xF0's) that appear at the beginning cause the first "packet" on the Arduino to read [0xFF 0xFF 0xFF 0xFF 0xAA 0xAA 0xAA 0xAA 0xAA 0xAA 0xAA 0xAA 0xBB 0xBB 0xBB 0xBB], which is shifted to the right by the 4 phantom bytes. The shift propagates into all later "packets". The only way to work around this is to employ a header/checksum to wrap the data bytes and pull everything into a separate buffer. But this adds a lot of overhead that seems wasteful for working around 3 or 4 phantom bytes that appear only once when the port is first opened, and especially for a single-message, unidirectional communication scheme.
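
Concretely, the resync I have in mind would look something like this on the Arduino side (untested, with a hypothetical 0xA5 header byte that the PC would prepend to every 16-byte packet):

const byte HEADER = 0xA5; // hypothetical start-of-packet marker
const int PAYLOAD = 16;   // data bytes per packet

void setup()
{
    Serial.begin(14400);
}

void loop()
{
    // throw bytes away until a header shows up, so phantom bytes can't shift the packet
    if (Serial.available() > 0 && Serial.read() == HEADER)
    {
        char buffer[PAYLOAD];
        if (Serial.readBytes(buffer, PAYLOAD) == PAYLOAD)
        {
            // ... 16 correctly aligned data bytes ready to use here ...
        }
    }
}

A checksum byte after the payload would also catch a 0xA5 occurring in the data at the wrong moment, but even the header alone gets rid of the shifting.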

In response to using the delay(100) after Serial.begin()... the problem only occurs when the serial port is opened on the PC side, so it is unrelated to the Arduino boot, setup(), loop(), or transition between any of those states. I've disabled DTR on the PC side to make sure the Arduino does not reboot when the port is opened on the PC. I also wrote a quick sketch to determine when the phantom bytes are actually appearing using the Uno's built in LED on pin 13:

void setup()
{
    pinMode(13, OUTPUT);
    Serial.begin(14400);
}

void loop()
{
    digitalWrite(13, (Serial.available() > 0)); // LED on whenever unread bytes sit in the receive buffer
}

When the Arduino boots, the LED remains off. I waited a good 10 seconds after it booted, and opened the port on the PC (with DTR off so the Arduino doesn't reboot). The moment the port was opened, the LED turned on, signalling that the phantom bytes were loaded into the serial receive buffer. As I mentioned before, I'm going to try this again at home and watch the RX LED during this process.

Any thoughts? And better yet, can anybody recreate this issue with an Uno and .NET or any other way?

It sounds to me as though you need start and end of message markers and to change the way that you read the incoming data so that you always get a complete message that was positioned between the markers.
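
Something like this, for example (untested, and with '<' and '>' picked arbitrarily as the markers):

const char START_MARKER = '<';
const char END_MARKER = '>';

char message[32];
byte msgIndex = 0;
bool inMessage = false;

void setup()
{
  Serial.begin(9600);
}

void loop()
{
  while (Serial.available())
  {
    char c = Serial.read();
    if (c == START_MARKER)
    {
      inMessage = true; // start collecting; anything before the marker is discarded
      msgIndex = 0;
    }
    else if (c == END_MARKER && inMessage)
    {
      message[msgIndex] = '\0'; // a complete message is now in message[]
      inMessage = false;
      // ... act on the message here ...
    }
    else if (inMessage && msgIndex < sizeof(message) - 1)
    {
      message[msgIndex++] = c;
    }
  }
}

Any phantom bytes that arrive before the start marker are simply thrown away.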

That's what I was talking about with the header/checksum to wrap the message up, but as I mentioned, this seems like a lot of overhead added in just to work around 4 bytes at the very beginning of the program. Plus using work-arounds instead of solutions always feels like a splinter that I can't get out :stuck_out_tongue:

UKHeliBob:
14400 baud is available here using 1.0.5 under Windows. What rev of the IDE are you running and on what OS ?

1.5.4 on Windows 7.

In the dropdown I have: 300, 1200, 2400, 4800, 9600, 19200, 57600, 115200

I've never worried about it, because I don't have any peripherals that require any particular baud rate, and if I did, I'd use a terminal program like TeraTerm or something I write myself.

pcdangio:
That's what I was talking about with the header/checksum to wrap the message up, but as I mentioned, this seems like a lot of overhead added in just to work around 4 bytes at the very beginning of the program. Plus using work-arounds instead of solutions always feels like a splinter that I can't get out :stuck_out_tongue:

Well, you can always just place a Serial.print() of a known string in setup(), then check for it after you open the port. Once that's in, you no longer have a sync problem. I haven't encountered the problem, but that's what I would do if I did.
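
On the Arduino side that's just something like this (the string itself is arbitrary):

void setup()
{
  Serial.begin(14400);
  Serial.print("READY"); // known sync string the PC looks for after opening the port
}

void loop()
{
  // normal communication from here on
}

The .NET side then reads and discards everything up to and including "READY" right after opening the port, and from that point the two ends agree on where the stream starts.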

pcdangio:
That's what I was talking about with the header/checksum to wrap the message up, but as I mentioned, this seems like a lot of overhead added in just to work around 4 bytes at the very beginning of the program. Plus using work-arounds instead of solutions always feels like a splinter that I can't get out :stuck_out_tongue:

I would not regard it as a work-around, more an essential part of a robust serial communication system. It is not that difficult to implement and would make the communications more bullet-proof all round.

UKHeliBob:
I would not regard it as a work-around, more an essential part of a robust serial communication system. It is not that difficult to implement and would make the communications more bullet-proof all round.

Yeah, I hear you. I might as well set up some code to do that now anyway for some of the more complex stuff in the future. I'm still interested in finding out why the bytes appear, though.