Hello all,
I have been troubleshooting an odd problem for the past couple of days. I have found out what causes it, but I am not sure why.
I have a Mega with an Ethernet Shield attached. All of the Mega's serial ports are connected to my computer, as is the Ethernet connection from the shield.
I wrote a C# Forms app to test the connections. Its purpose is to send commands/messages to the Arduino.
When I send out Ethernet 1 or 2 from the app, the Arduino receives the message and writes it out Serial 1 or 2, respectively, where it is read back into the form app over my computer's serial ports (and vice versa for Serial -> Ethernet). I did this to check that all ports were working properly.
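For reference, the forwarding loop on the Mega looks roughly like this (a minimal sketch, not my exact code; the MAC, IP, and port values are placeholders):

#include <SPI.h>
#include <Ethernet.h>

// Placeholder network settings, not my real values
byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };
IPAddress serverIp(192, 168, 1, 100); // the PC running the C# app

EthernetClient m_TcpClientSerial1;
EthernetClient m_TcpClientSerial2;

void setup() {
  Serial1.begin(9600);
  Serial2.begin(9600);
  Ethernet.begin(mac, IPAddress(192, 168, 1, 50)); // static IP, placeholder
  m_TcpClientSerial1.connect(serverIp, 5001); // placeholder ports
  m_TcpClientSerial2.connect(serverIp, 5002);
}

void loop() {
  // Eth1 <-> Serial1
  while (m_TcpClientSerial1.available()) Serial1.write(m_TcpClientSerial1.read());
  while (Serial1.available()) m_TcpClientSerial1.write(Serial1.read());

  // Eth2 <-> Serial2
  while (m_TcpClientSerial2.available()) Serial2.write(m_TcpClientSerial2.read());
  while (Serial2.available()) m_TcpClientSerial2.write(Serial2.read());
}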
When using the app, I would send out Eth1 and receive on Serial1. I would send out Eth2 and receive on Serial2. I would send out Serial1 and receive on Eth1. However, when I sent out Serial2, I would not receive on Eth2.
I'm going to skip over a lot of troubleshooting here; if needed, I can go into detail. Long story short, I determined that my C# code was fine and that the problem was on the Arduino side. I also determined that the Arduino was receiving the serial data, but was losing the Tcp2 connection. My Arduino code treated Serial1 and Serial2 identically, and likewise the Tcp1 and Tcp2 code. So what gives?
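(For context, the drop showed up with a simple check along these lines, placed alongside the forwarding code in the sketch above; Serial here is the USB debug port, started with Serial.begin() in setup():)

// inside loop(): Serial2 data keeps arriving, but the client reports disconnected
if (Serial2.available()) {
  Serial.print("Serial2 has data, Tcp2 connected = ");
  Serial.println(m_TcpClientSerial2.connected()); // starts printing 0 once the drop happens
}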
Well, I determined that the problem was the order in which I declared the EthernetClient objects:
EthernetClient m_TcpClientCommands;
EthernetClient m_TcpClientSerial1;
EthernetClient m_TcpClientSerial2;
I lose the connection to whichever client is declared last, in this case m_TcpClientSerial2. If I instead declared:
EthernetClient m_TcpClientSerial2;
EthernetClient m_TcpClientSerial1;
I'd lose the connection to m_TcpClientSerial1 instead.
As a workaround, I added m_TcpClientSerial3, even though it doesn't do anything else in my code:
EthernetClient m_TcpClientCommands;
EthernetClient m_TcpClientSerial1;
EthernetClient m_TcpClientSerial2;
EthernetClient m_TcpClientSerial3;
Everything now works fine.
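To be explicit about the workaround: m_TcpClientSerial3 is only declared. It is never connect()ed or referenced anywhere else, so setup() looks roughly like this (sketch, placeholder ports as before):

void setup() {
  Serial1.begin(9600);
  Serial2.begin(9600);
  Ethernet.begin(mac, IPAddress(192, 168, 1, 50));
  m_TcpClientCommands.connect(serverIp, 5000);
  m_TcpClientSerial1.connect(serverIp, 5001);
  m_TcpClientSerial2.connect(serverIp, 5002);
  // m_TcpClientSerial3 is never touched; declaring it is the entire workaround
}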
Why?