Completion of serial transmission - strange behaviour

There are two parts to the data stream:

The host computer transmits standard 2400 8N2 serial bytes, then looks for a reply from the slave. The slave's response is a PWM-style signal with a start bit of about 2 ms and a bit time of 1.1 ms. A "0" is indicated by a pulse width of less than 400 us, and a "1" by a width of more than 400 us (in practice, the times are about 300 us and 800 us).
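For anyone wanting to picture the receive side, here is a minimal sketch of how such a reply could be decoded by timing the low pulses on INT3 (pin 18 on the Mega). It assumes the slave pulls the line low for each bit (consistent with the line idling high, below) and treats any pulse longer than about 1.5 ms as the start bit; the names and the `attachInterrupt()` approach are mine, not necessarily what the original code does:

```cpp
// Decode a PWM-style reply by measuring low-pulse widths on pin 18 (INT3).
volatile uint32_t fallTime = 0;   // micros() at the last falling edge
volatile uint8_t  bitCount = 0;   // data bits accumulated so far
volatile uint16_t reply    = 0;   // bits shifted in MSB-first

const uint16_t BIT_THRESHOLD_US  = 400;   // < 400 us = "0", > 400 us = "1"
const uint16_t START_MIN_US      = 1500;  // the ~2 ms start bit exceeds this

void onEdge() {                   // attached to pin 18, triggered on CHANGE
  uint32_t now = micros();
  if (digitalRead(18) == LOW) {   // falling edge: a pulse has started
    fallTime = now;
  } else {                        // rising edge: measure the pulse just ended
    uint32_t width = now - fallTime;
    if (width > START_MIN_US) {   // start bit: reset the accumulator
      bitCount = 0;
      reply    = 0;
    } else {                      // data bit: classify by width
      reply = (reply << 1) | (width > BIT_THRESHOLD_US ? 1 : 0);
      bitCount++;
    }
  }
}

void setup() {
  pinMode(18, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(18), onEdge, CHANGE);
}

void loop() {}
```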

A host computer port can be connected to up to 4 slaves. These are effectively polled in turn, using the last nibble of the 6-byte command stream as an address; polling rotates through all 4 possible slave addresses. If a slave is not present, this is detected because the signal line remains high rather than being pulled low. To cover this case, I am using a timer to indicate that the expected response period has expired, so the response is recorded as effectively "null". A rough outline of that rotation is sketched below.
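This fragment shows one plausible shape for the polling loop described above; `sendCommand()` and `replyReceived()` are hypothetical placeholders standing in for the original transmit and receive routines:

```cpp
uint8_t cmd[6]  = {0};   // 6-byte command stream
bool present[4];         // last known state of each slave address

// Placeholders for the actual routines (stubbed so the fragment compiles):
void sendCommand(const uint8_t *c) { /* shift 6 bytes out on TXD1 */ }
bool replyReceived()               { /* true if INT3 fired before the timeout */ return false; }

void pollSlaves() {
  for (uint8_t addr = 0; addr < 4; addr++) {
    cmd[5] = (cmd[5] & 0xF0) | addr;  // address goes in the last nibble
    sendCommand(cmd);
    // false => the line stayed high until the timer expired: no slave,
    // so this address gets a "null" response
    present[addr] = replyReceived();
  }
}
```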

I need to send a serial stream of 6 bytes, then disable the serial transmitter and turn the pin around to receive the slave response. I am using pin 18 on the Mega for communication: it acts as TXD1 on transmit and INT3 on receive. When the TXC1 flag signals in transmit mode that the command stream has been transmitted completely, my handler disables the transmitter, starts Timer3, and enables INT3. If there is no activity on INT3 before Timer3 times out, a "null" response is recorded.
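For context, a minimal sketch of that turnaround done in the TXC1 interrupt (the flag itself only signals completion; the ISR does the work). Register names are the ATmega2560's; `TIMEOUT_TICKS` and the Timer3 mode setup are placeholders, not the actual values, and INT3's edge configuration (EICRA) is assumed to be done elsewhere:

```cpp
#include <avr/io.h>
#include <avr/interrupt.h>

#define TIMEOUT_TICKS 3000              // placeholder Timer3 compare value

ISR(USART1_TX_vect) {                   // fires once the last stop bit is out
  UCSR1B &= ~(_BV(TXEN1) | _BV(TXCIE1)); // release TXD1 so pin 18 reverts to input
  TCNT3   = 0;                          // restart Timer3 for the timeout window
  OCR3A   = TIMEOUT_TICKS;
  TIMSK3 |= _BV(OCIE3A);                // arm the timeout interrupt
  EIFR    = _BV(INTF3);                 // clear any stale INT3 flag first
  EIMSK  |= _BV(INT3);                  // now listen for the slave's reply
}
```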

Generating a command and interpreting the response is not the issue I intended to raise in my original post. As I said there, if I set my timer to 15 ms, the timeout expires in the middle of transmitting the command stream; if it is reduced to 12 ms, everything works as expected. My question is:

What is the mechanism that apparently allows the timer setting to affect the operation of the USART?