how much time would elapse between the state changes and the pin changing HIGH/LOW?
I've read a few times on this forum that interrupt latency is around 3.5 µs; that's the time the MCU needs to finish the current instruction, push registers onto the stack, and enter the ISR. Part of that time varies with the "current" instruction: 1 to 4 cycles, at 1/16 MHz = 62.5 ns per cycle, so the expected jitter is 62.5 to 250 ns. There also shouldn't be interference from another interrupt subroutine, as that would add a much longer delay, depending on the code inside the interfering ISR. My own experience with video sync extraction shows pretty stable results, jitter of only 1 or 2 cycles, so answering the next question:
How consistent would the delay be between executions? Are we talking microseconds, nanoseconds?
Very consistent. Make sure there are no "racing" interrupts, and avoid digitalWrite, digitalRead, analogWrite, etc., as they use cli()/sei() unreasonably often (IMHO).
Also, do the hardware interrupt pins use a timer, and does the timer frequency affect the response time?
No, the external interrupt pins are "independent" hardware; they don't use a timer, so the timer frequency has no effect on the response time.