intellijel:
I am getting very inconsistent timing for the 4 outputs. From my analysis so far it seems that the ISRs are causing the random delay of the output signal changes.
Do you have figures?
I have some stuff about interrupts here:
It takes about 1.5 µs to enter an ISR (more for the built-in ones for external interrupts). But you are describing changes that happen every 2 ms, which is a lot longer than that.
As I discuss on that page, interrupts are likely to have some jitter, simply because you can't interrupt in the middle of an instruction, so you have at least something like a one- or two-clock-cycle variation. Also, interrupts can't occur while you are already servicing an interrupt, so if you are processing (say) a timer interrupt, your pin-change interrupt could be delayed.
I would have to see your code to comment further; there might be efficiencies that can be added.
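One general efficiency is to keep each ISR to the bare minimum: grab a timestamp or set a flag, and do the heavier work back in loop(). A minimal sketch of that pattern, assuming a signal on pin 2 (the pin and baud rate are just placeholders):

```cpp
// Keep the ISR as short as possible so other interrupts are blocked
// for the least amount of time. Pin 2 is an assumption for illustration.
volatile unsigned long lastEdgeMicros = 0;
volatile bool edgeSeen = false;

void pinISR() {
  lastEdgeMicros = micros();   // just capture a timestamp
  edgeSeen = true;             // and flag it; real work happens in loop()
}

void setup() {
  Serial.begin(115200);
  pinMode(2, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(2), pinISR, CHANGE);
}

void loop() {
  if (edgeSeen) {
    noInterrupts();                    // copy shared data atomically
    unsigned long t = lastEdgeMicros;
    edgeSeen = false;
    interrupts();
    Serial.println(t);                 // heavier work goes here, not in the ISR
  }
}
```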
I had a recent project where I wanted to generate VGA video signals, and even one clock cycle added too much jitter. In that case putting the processor to sleep helped get consistent results.
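The trick there was to idle the processor just before the interrupt was expected, so the wake-up latency is the same every time instead of varying with whatever instruction happened to be executing. A rough AVR-only sketch of the idea (not the actual VGA code):

```cpp
// Idle the CPU so the next interrupt always wakes it from the same state,
// removing the one-or-two-cycle instruction-boundary jitter.
#include <avr/sleep.h>

void setup() {
  // ... configure your timer or pin-change interrupt here ...
  set_sleep_mode(SLEEP_MODE_IDLE);   // idle keeps timers and I/O clocks running
}

void loop() {
  sleep_enable();
  sleep_cpu();        // CPU halts here; the ISR runs on wake with consistent latency
  sleep_disable();
  // handle whatever the ISR flagged, then go back to sleep
}
```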
I think you need to quantify your expectations. How much jitter can you tolerate, and how much are you measuring?
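If it helps, one hypothetical way to put a number on it is to timestamp each edge in the ISR and track the spread of the intervals; the pin, edge direction, and baud rate below are just assumptions:

```cpp
// Record the interval between successive edges on pin 2 and print the
// min/max spread, which is a rough measure of the jitter.
volatile unsigned long lastEdge = 0;
volatile unsigned long minInterval = 0xFFFFFFFF;
volatile unsigned long maxInterval = 0;

void edgeISR() {
  unsigned long now = micros();
  if (lastEdge != 0) {
    unsigned long interval = now - lastEdge;
    if (interval < minInterval) minInterval = interval;
    if (interval > maxInterval) maxInterval = interval;
  }
  lastEdge = now;
}

void setup() {
  Serial.begin(115200);
  pinMode(2, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(2), edgeISR, RISING);
}

void loop() {
  delay(1000);
  noInterrupts();                       // copy shared data atomically
  unsigned long lo = minInterval;
  unsigned long hi = maxInterval;
  interrupts();
  if (hi >= lo) {                       // only valid once at least two edges were seen
    Serial.print("interval spread (us): ");
    Serial.println(hi - lo);
  }
}
```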