I'm no expert on either computers or programming, so please bear with me:
You can run a timer at 16 MHz, so you can get a precision of 1/16 of a microsecond.
But the CPU can only do one thing at a time, right?
Assume that input_0 goes HIGH at t = 0 µs.
Then the Arduino must spend x clock cycles writing the current time and the input ID into memory before it can continue scanning the other ports. That number of cycles determines the minimum time difference we can measure, right?
You can use the Pin Change Interrupt feature to run an ISR when any of the pins change. As each pin goes HIGH, note the pin number and timer count and disable the interrupt for that pin. Compare timer counts to get relative timing.
I'm not familiar with either PinChangeInt or ISRs, but I will look into them. Thanks for the tip!
How do I create a timer that can count in steps of 1/16 µs? The function micros()
only returns values in multiples of 4, so the minimum time step would be 4 µs.