Thanks for all of the replies, everybody. I'm at work right now, but I'll check the IDE version when I get home. I suspect the baud rates available in the Serial Monitor drop-down menu may depend on which board is selected in the IDE settings.
To boil things down a little further: the problem is not with the communication itself. I can send and receive bytes just fine between the two endpoints. It's just that some random bytes appear in the Arduino's serial receive buffer when the port is opened from the other end, even though no data has actually been sent by the code running on the PC. One thing I'm also going to do when I get back is closely watch the RX LED on the Arduino when the port is opened on the PC. Maybe this will show whether the bytes are actually being sent by the PC on port opening, or whether the Arduino is just magically placing bytes in the receive buffer. I may also try a serial port emulator on the PC... will keep you guys posted on this.
Now the reason that this is such a big problem for me is because these randomly appearing phantom bytes cause all of my serial messages to become shifted. For example, say I'm sending 16 bytes of data at a time from my .NET code. The receiving Arduino code in the loop() would be:
void loop()
{
  char buffer[16];
  // readBytes() blocks until 16 bytes arrive or the stream
  // timeout (1000 ms by default) expires
  Serial.readBytes(buffer, 16);
}
The issue is that, when the port is opened from the PC side, those phantom bytes appear in the serial buffer and cause all data sent from that point on to be shifted. Say I send 8 bytes of 0xAA and 8 bytes of 0xBB in the 16-byte "packet" from the PC to the Arduino. The phantom bytes (usually several 0xFF's or 0xF0's) that appear at the beginning cause the first "packet" on the Arduino to read [0xFF 0xFF 0xFF 0xFF 0xAA 0xAA 0xAA 0xAA 0xAA 0xAA 0xAA 0xAA 0xBB 0xBB 0xBB 0xBB], i.e. shifted to the right by the 4 phantom bytes, and the shift propagates into all later "packets". The only workaround I can see is to wrap the data bytes in a header/checksum frame and pull everything into a separate buffer for resynchronization. But that adds a lot of overhead that seems wasteful for working around 3 or 4 phantom bytes that appear only once when the port is first opened, especially for a single-message, unidirectional communication scheme.
In response to the suggestion of using delay(100) after Serial.begin(): the problem only occurs when the serial port is opened on the PC side, so it is unrelated to the Arduino boot, setup(), loop(), or the transitions between them. I've disabled DTR on the PC side to make sure the Arduino does not reboot when the port is opened. I also wrote a quick sketch to determine when the phantom bytes actually appear, using the Uno's built-in LED on pin 13:
void setup()
{
  pinMode(13, OUTPUT);
  Serial.begin(14400);
}

void loop()
{
  // LED on whenever at least one byte is waiting in the receive buffer
  digitalWrite(13, (Serial.available() > 0));
}
When the Arduino boots, the LED stays off. I waited a good 10 seconds after boot, then opened the port on the PC (with DTR off so the Arduino wouldn't reboot). The moment the port was opened, the LED turned on, signaling that the phantom bytes had been loaded into the serial receive buffer. As I mentioned before, I'm going to try this again at home and watch the RX LED during this process.
Any thoughts? And better yet, can anybody recreate this issue with an Uno and .NET or any other way?