Timing precision after using I2C

I use an Arduino Uno to receive a trigger from another instrument and then output a digital signal after a certain delay relative to the trigger. Initially I could set the minimum delay to around 4 µs, and the rising edge of the digital signal showed about 200 ns of jitter when observed with an oscilloscope.

After adding I2C (in order to accept data from a computer or another Arduino), I found:

1) The delay between the trigger and the digital signal becomes much larger. It is now more than 20 µs, and I have found no way to make it shorter than that.

2) The precision of the rising edge of the digital signal gets worse: it now has about 500 ns of jitter. In other words, after adding I2C, the timing precision with which I can fire the digital signal is degraded.

Is the digital signal timing precision affected by I2C? Somebody said that the I2C interrupt interferes with the digital signal timing. Is that true?

Thanks for the help!

Yes, it is true as soon as software is involved. The Arduino Uno uses the ATmega328P microcontroller, which contains many hardware peripherals, for example timers, I2C, SPI, and UART. Each peripheral works independently of the others. But as soon as software is used, for example interrupt routines, one interrupt routine delays another.

Do you use attachInterrupt() with a delay inside the interrupt routine? An interrupt is delayed by other interrupts, because interrupt routines are executed one after another (there are exceptions, though). The Wire library uses interrupts, and so does the Serial library; see the sketch below for how that shows up on your output pin.
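
To illustrate, here is a minimal sketch of the pattern you describe (the trigger pin, output pin, and slave address are my assumptions, not taken from your post). Entry into the trigger routine is postponed whenever the Wire (TWI) interrupt is already running, and that shows up as extra delay and jitter on the output edge:

```cpp
#include <Wire.h>

// Assumed wiring: trigger on pin 2 (INT0), output on pin 9 (PB1 on the Uno).
void onTrigger() {
  delayMicroseconds(4);      // fixed delay relative to the trigger
  PORTB |= (1 << PB1);       // pin 9 high via direct port access (fast)
  PORTB &= ~(1 << PB1);      // pin 9 low again
}

void setup() {
  pinMode(2, INPUT);
  pinMode(9, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(2), onTrigger, RISING);
  Wire.begin(8);             // I2C slave at address 8: the TWI interrupt
                             // now competes with onTrigger() for the CPU
}

void loop() {
  // nothing here; all the timing-critical work is interrupt-driven
}
```

If the TWI interrupt happens to be executing when the trigger arrives, onTrigger() cannot start until it finishes, so the output edge moves by however long that takes. That is exactly the kind of added latency and jitter you measured.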

I have read about this problem in the context of triggering a flash for photography, but I forget what solution was used :( The hardware timers can be used as a one-shot delay, but I don't know off-hand whether they can be triggered in hardware by an external pin.
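
That said, Timer1 on the ATmega328P does have an input-capture unit: an edge on ICP1 (Uno pin 8) latches the timer value in hardware, and the compare unit can then drive OC1A (Uno pin 9) directly. A minimal sketch of that idea, assuming the trigger is rewired to pin 8 and the output is taken from pin 9:

```cpp
#include <avr/interrupt.h>

const uint16_t DELAY_TICKS = 64;          // 64 ticks = 4 µs at 16 MHz, prescaler 1

void setup() {
  pinMode(8, INPUT);                      // ICP1: trigger input
  pinMode(9, OUTPUT);                     // OC1A: delayed output
  digitalWrite(9, LOW);

  TCCR1A = 0;                             // normal mode, OC1A disconnected for now
  TCCR1B = (1 << ICES1) | (1 << CS10);    // capture on rising edge, prescaler 1
  TIFR1  = (1 << ICF1);                   // clear any stale capture flag
  TIMSK1 = (1 << ICIE1);                  // enable the input-capture interrupt
}

ISR(TIMER1_CAPT_vect) {
  OCR1A  = ICR1 + DELAY_TICKS;            // schedule the edge from the captured time
  TCCR1A = (1 << COM1A1) | (1 << COM1A0); // hardware sets OC1A high on compare match
  TIFR1  = (1 << OCF1A);                  // clear any pending compare flag
  // To re-arm: clear the COM1A bits and drive pin 9 low again after the pulse.
}

void loop() {
  // Wire traffic here only delays ISR entry; the output edge itself is
  // generated by the compare hardware from the captured trigger time, so
  // its jitter is one timer tick (62.5 ns), as long as the ISR finishes
  // before the compare match. DELAY_TICKS must therefore exceed the
  // worst-case interrupt latency, or the edge slips by a full timer wrap.
}
```

The point of this arrangement is that both edges are handled in hardware: the capture unit timestamps the trigger, and the compare unit fires the output, so the I2C interrupt can delay the bookkeeping in between without moving the output edge.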