Hello,
I have a problem here which is driving me nuts. The usual story.
I have two measurement devices connected to an Arduino Nano, both communicating over UART.
The 1st one with 9600 8N1
The 2nd one with 9600 8N2
Both are Modbus devices.
I use two software UARTs based on the CustomSoftwareSerial library.
The 1st has RX on pin 10 and TX on pin 11.
The 2nd has RX on pin 8 and TX on pin 9.
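Boiled down, the setup looks like this (variable names are simplified for this post; pins and configs are exactly as above, with CSERIAL_8N1 / CSERIAL_8N2 being the library's config constants):

```cpp
#include <CustomSoftwareSerial.h>

CustomSoftwareSerial uart1(10, 11);   // 1st device: RX = 10, TX = 11
CustomSoftwareSerial uart2(8, 9);     // 2nd device: RX = 8,  TX = 9

void setup() {
  uart1.begin(9600, CSERIAL_8N1);     // 1st device: 9600 8N1
  uart2.begin(9600, CSERIAL_8N2);     // 2nd device: 9600 8N2
}
```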
As both devices speak Modbus, the subroutines for the two are almost identical; the flow is the same, only the names differ.
The devices are accessed sequentially: first the UART for the 1st device is engaged, then the UART for the 2nd, then again the 1st, then the 2nd, and so forth. The pattern is sketched below.
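The main loop boils down to this pattern (modbusQuery() is a stand-in for the shared subroutine, and the request frames are just illustrative single-register reads for slaves 1 and 2, not my real requests):

```cpp
// Illustrative Modbus RTU requests: read 1 holding register at address 0
// from slave 1 and slave 2 (last two bytes are the CRC, low byte first).
uint8_t request1[] = {0x01, 0x03, 0x00, 0x00, 0x00, 0x01, 0x84, 0x0A};
uint8_t request2[] = {0x02, 0x03, 0x00, 0x00, 0x00, 0x01, 0x84, 0x39};

// Stand-in for the shared subroutine: send a request, collect the reply.
void modbusQuery(CustomSoftwareSerial &uart, const uint8_t *frame, size_t len) {
  uart.listen();             // only one soft UART can receive at a time
  uart.write(frame, len);    // transmit the request
  delay(100);                // crude wait for the slave to answer
  while (uart.available()) {
    uart.read();             // drain the response (real code parses and checks CRC)
  }
}

void loop() {
  modbusQuery(uart1, request1, sizeof(request1));  // 1st device (8N1, pins 10/11)
  modbusQuery(uart2, request2, sizeof(request2));  // 2nd device (8N2, pins 8/9)
  // then again the 1st, then the 2nd, and so forth
}
```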
Now the strange thing is that the bit timing (when the Arduino transmits) for the 1st device is off: in the TX transmission to the 1st device, each bit takes 12.5 µs longer than it should.
For the 2nd device the bit timing is spot on.
Both devices respond with the correct bit timing.
As a result, the 2nd device responds correctly, while the 1st one does not respond at all; it simply does not understand the message from the Arduino.
I wrote a test program that accesses the 1st device only; there the bit timing is spot on as well, and the 1st device works correctly.
The test program uses the very same subroutines as the real one, yet it is still long. So I can attach it and get scolded for its length, or leave it out and get scolded for not posting my code. I'm not sure what to do, especially since the test program works and only the real program would be of use for debugging.
Obviously, without code, debugging is not possible. If anyone, despite its size, still wants to look into it, I'm happy to provide it.
My initial hope is that someone can tell me why and how the bit timing in CustomSoftwareSerial can be influenced/changed at runtime.
Maybe that will lead me to the problem.
Thanks and BR