# Maximum Digital Sample Rate possible (3 digital inputs)?

Hi

So what I basically want to do is measure the time difference between three signals as fast as possible.
I have an Arduino Uno (16 MHz). I'm planning to use digital inputs only (since I assume they're faster than the analog ones).

All three digital inputs should be low by default.

Pseudo code:

```
Scan all three inputs until one of them turns high
Memorize which input turned high
Start a timer

Scan the other two inputs until one of them turns high
Memorize which one turned high
Memorize the time from the timer

Scan the last input and wait for it to turn high
Memorize the time from the timer
Memorize the input number
```

So my question is: what is the smallest time difference that is possible to measure with an Arduino Uno?

Also: If you have any idea of how to improve the algorithm, please let me know.

You can run a timer at 16 MHz, so you can get a precision of 1/16th of a microsecond.

You can use the Pin Change Interrupt feature to run an ISR when any of the pins change. As each pin goes HIGH, note the pin number and timer count and disable the interrupt for that pin. Compare timer counts to get relative timing.
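The bookkeeping inside such an ISR can be sketched host-side. Everything here uses stand-in names (`pinRegister`, `timerCount`, `pinMask`, `pinChangeISR`) so the logic can be tried off-chip; on a real Uno you would read `PIND` and `TCNT1` and clear bits in `PCMSK2` instead:

```cpp
#include <cstdint>

// Stand-ins for the Uno's hardware, so this sketch runs on a host.
volatile uint8_t pinRegister = 0;   // simulated pin input register (PIND)
volatile uint16_t timerCount = 0;   // simulated 16-bit timer count (TCNT1)
uint8_t pinMask = 0b00011100;       // pins 2-4 still armed (PCMSK2)

uint8_t eventPin[3];                // which pin fired (bit number)
uint16_t eventTime[3];              // timer count at that moment
uint8_t eventCount = 0;

// Body of a pin-change ISR: note which armed pin went HIGH, stamp the
// timer count, then disarm that pin so later changes on it are ignored.
void pinChangeISR() {
  uint8_t fired = pinRegister & pinMask;   // armed pins that are now HIGH
  for (uint8_t bit = 2; bit <= 4; ++bit) {
    if (fired & (1 << bit)) {
      eventPin[eventCount] = bit;
      eventTime[eventCount] = timerCount;
      ++eventCount;
      pinMask &= ~(1 << bit);              // stop watching this pin
    }
  }
}
```

After all three events are captured, subtracting the `eventTime` entries pairwise gives the relative timing.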

I'm no expert on either computers or programming, so please bear with me.

johnwasser:
You can run a timer at 16 MHz, so you can get a precision of 1/16th of a microsecond.

But the CPU can only do one thing at a time?

Assume that input_0 turns high at t = 0 µs.
Then the Arduino must spend x clock cycles writing the current time and input ID into memory before it can continue scanning the other ports. This number of cycles will determine the minimum time difference we can measure, right?

You can use the Pin Change Interrupt feature to run an ISR when any of the pins change. As each pin goes HIGH, note the pin number and timer count and disable the interrupt for that pin. Compare timer counts to get relative timing.

I'm not familiar with either pin change interrupts or ISRs, but I will look into it. Thanks for the tip! Also:
How do I create a timer that can count every 1/16 µs? The function micros() can only return values that are multiples of 4, so the minimum time step would be 4 µs.

Norrlandssiesta:
But the CPU can only do one thing at the same time?

Yes. Fortunately the timers are implemented in hardware, so the chip can count clock cycles at the same time your software runs. You will want to read about Timer/Counter 0, 1, and 2 in the ATmega328P datasheet. To note the time, you just copy the counter register.
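To answer the 1/16 µs question concretely: with Timer 1 clocked at 16 MHz (prescaler 1), each tick is 1/16 µs, and unsigned subtraction of two 16-bit counter readings gives the correct modulo-65536 interval even across one rollover. A minimal arithmetic sketch (`ticksBetween` and `ticksToMicros` are illustrative helper names, not Arduino functions):

```cpp
#include <cstdint>

// Interval between two 16-bit Timer 1 readings. Unsigned subtraction
// yields the modulo-65536 difference, so a single counter rollover
// between the two readings is handled automatically.
uint16_t ticksBetween(uint16_t start, uint16_t end) {
  return (uint16_t)(end - start);
}

// At 16 MHz with prescaler 1 there are 16 timer ticks per microsecond.
double ticksToMicros(uint16_t ticks) {
  return ticks / 16.0;
}
```

For example, readings of 65530 then 10 (the counter wrapped in between) still give 16 ticks, i.e. 1 µs.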

You are right that the time it takes to handle the interrupt will limit the minimum delta-t you can measure. If you expect the events to be very close together, you could keep interrupts off and poll the inputs in a tight loop looking for changes. The Arduino can execute 16 single-cycle instructions per microsecond, so you should be able to get resolution of a microsecond or better.

```
static volatile byte dataBuffer[16];
static volatile boolean done = false;
byte *pointer = (byte *)dataBuffer;
static byte oldPins = 0, pins;
do {
  while ((pins = PINB & B00011100) == oldPins)
    { /* WAIT */ }     // wait for PORTB bits 2-4 (digital pins 10, 11, 12) to change
  *pointer++ = TCNT2;  // Note clock counter from Timer 2
  *pointer++ = pins;   // Note current state of the input pins
  oldPins = pins;      // Look for further changes
} while (pins != B00011100);
done = true;
```

You would then post-process the data buffer to see what order the changes came in. The buffer needs two bytes for each transition. An 8-bit counter (like Timer 0 or 2) will roll over every 16 microseconds. If your experiment is likely to take much more than 16 microseconds you might need to use the 16-bit counter (Timer/Counter 1).
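The post-processing step can be sketched like this (`Event` and `decode` are hypothetical names; the two-bytes-per-transition layout matches the polling loop above). An 8-bit unsigned subtraction gives the correct delta as long as consecutive events are under 16 µs (256 ticks) apart:

```cpp
#include <cstdint>
#include <vector>

struct Event {
  uint8_t pinBit;      // which bit of the port newly went HIGH
  uint8_t deltaTicks;  // Timer 2 ticks since the previous event (0 for the first)
};

// buf holds nEvents pairs of (timer count, pin state), as written by the
// polling loop: buf[2i] is the Timer 2 count, buf[2i+1] the pin bits.
std::vector<Event> decode(const uint8_t *buf, int nEvents) {
  std::vector<Event> out;
  uint8_t prevPins = 0, prevT = 0;
  for (int i = 0; i < nEvents; ++i) {
    uint8_t t = buf[2 * i];
    uint8_t pins = buf[2 * i + 1];
    uint8_t changed = pins ^ prevPins;     // the bit that just went HIGH
    uint8_t bit = 0;
    while (!(changed & 1)) { changed >>= 1; ++bit; }
    uint8_t delta = (uint8_t)(t - prevT);  // modulo-256 difference, rollover-safe
    out.push_back({bit, (uint8_t)(i == 0 ? 0 : delta)});
    prevPins = pins;
    prevT = t;
  }
  return out;
}
```

Multiplying each `deltaTicks` by the Timer 2 tick period then gives the time differences between the inputs.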

Thanks a lot! You just spared me lots of time. I'll tip you some bitcoins. I'll reply when I have some measurement results.

Oops. I had forgotten to put in the line `oldPins = pins; // Look for further changes`. Corrected above.