I'm using the AD7768 in a project (http://www.analog.com/en/products/analog-to-digital-converters/ad7768.html). I would like to read data from it using an Arduino DUE (84MHz ARM SAM3).
This chip acts sort of like an SPI master, except it sends out the data on eight separate lines and pulses DRDY instead of using a framing signal. The incoming data clock is 4MHz, so nominally I have 21 CPU cycles to work with per clock period (84MHz / 4MHz).
I have connected the input clock to pin 30, and am using pins 14-21 for the data line inputs. Data is clocked in on a falling edge of the clock. DRDY is supposed to fall when a result is about to be clocked out.
To try to make this run fast enough, I implemented an interrupt on the falling edge of DRDY, and then do this (removed all but one input channel both for my testing and for clarity):
boolean ADCDataBuffer[8][32];
void ADCDataHandler () {
  noInterrupts();
  while (digitalRead(30));                 // wait for the clock (pin 30) to fall
  ADCDataBuffer[0][31] = digitalRead(21);  // sample the data line on the falling edge
  while (!digitalRead(30));                // wait for the clock to go high again
  while (digitalRead(30));                 // wait for the next falling edge
  ADCDataBuffer[0][30] = digitalRead(21);
  ...
and I continue in that manner, fully unrolled, polling bit by bit inside the interrupt with interrupts disabled for its duration.
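On the Due (SAM3X), most of digitalRead's cost is the per-call pin-to-port lookup, so a common workaround is to hoist that lookup out of the handler: cache each pin's PIO port pointer and bit mask once, then read the port's PDSR register directly. A minimal sketch, assuming the pin numbers from the post (30 = clock, 21 = data) and the stock Arduino Due core (`g_APinDescription` is the core's pin table; `pinLevel` is a helper name I made up):

```cpp
#include <stdint.h>

// Pure helper: extract one pin's level from a 32-bit snapshot of a PIO
// port's Pin Data Status Register (PDSR). mask is the pin's single-bit mask.
static inline bool pinLevel(uint32_t pdsr, uint32_t mask) {
  return (pdsr & mask) != 0;
}

// On the Due, the pin-to-port mapping is available at run time from the
// core's pin table, so the lookup can be done once at setup:
//   Pio      *clkPort  = g_APinDescription[30].pPort;  // clock pin from the post
//   uint32_t  clkMask  = g_APinDescription[30].ulPin;  // single-bit mask
//   Pio      *dataPort = g_APinDescription[21].pPort;
//   uint32_t  dataMask = g_APinDescription[21].ulPin;
// Each bit then costs only a register read and an AND inside the handler:
//   while (pinLevel(clkPort->PIO_PDSR, clkMask)) ;     // wait for clock low
//   ADCDataBuffer[0][31] = pinLevel(dataPort->PIO_PDSR, dataMask);
```

This should bring a single pin read down to a handful of cycles instead of hundreds.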
In between reads I'm sending the data to my computer over USB, but the data it reports is not showing the correct bits. I scoped the lines, so I'm confident the pins are seeing the correct data. The input clock also looks clean on the scope, with no ringing that might cause repeated triggers.
I'm rather stuck using those pins, unfortunately. I thought about trying to align the data inputs to a real physical port, but it wasn't possible.
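Even with the eight lines scattered across ports, each clock edge only needs one PDSR read per port rather than eight digitalRead calls: snapshot each port once, then pick the bits out of the snapshots. A sketch of that idea (`DataLine` and `packDataLines` are hypothetical names; the per-pin port indices and masks would be filled in from the Due core's `g_APinDescription` table):

```cpp
#include <stdint.h>

// Hypothetical descriptor for one AD7768 data line: which captured port
// snapshot it lives in (index 0 or 1) and its bit mask within that port.
struct DataLine { uint8_t port; uint32_t mask; };

// Pack the eight data lines out of (at most) two PDSR snapshots into one
// byte, bit i = data line i. Two register reads replace eight
// digitalRead() calls per clock edge.
static uint8_t packDataLines(const uint32_t snap[2], const DataLine lines[8]) {
  uint8_t out = 0;
  for (int i = 0; i < 8; ++i)
    if (snap[lines[i].port] & lines[i].mask)
      out |= (uint8_t)(1u << i);
  return out;
}

// On the Due the snapshots would be captured once per falling clock edge,
// e.g. uint32_t snap[2] = { PIOA->PIO_PDSR, PIOD->PIO_PDSR };
// (substitute whichever ports your eight pins actually land on).
```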
I did some additional timing debugging by recording micros() at the beginning and end of the interrupt. If I just do:
void ADCDataHandler() {
  starttime = micros();
  ADCDataReady = true;
  endtime = micros();
}
and when I print endtime-starttime I get 1-2us (!?!).
Simply reading the first 8 bits of a single channel bit by bit, with the while statements gating each bit, gives a 64us delay inside the interrupt.
If I just add while(digitalRead(30)) between starttime = micros() and ADCDataReady = true it reports 2-3us.
If I remove the while(digitalRead(30)) delay and simply have digitalRead(21) with no assignment to a variable, it is also 2-3us. If I add in the assignment with "ADCDataBuffer[0][31] = digitalRead(21)" instead, the time is similar.
Does anyone have any ideas? Simply reading a digital pin seems to be unbelievably slow, microseconds! From reading other forum posts, it sounds like it takes nearly 5us to read a digital pin on the AVR architecture too!
What is going on under the hood here? I can only assume that digitalRead is doing a whole bunch of hidden work, because this seems like it should be a one-opcode pin read plus a few opcodes of overhead to store the result in RAM. I'll probably end up having to drop the clock speed to make it usable, but this is so far away from fast enough... I'd have to drop the frequency down to ~3kHz to keep up at the reported interrupt run times.
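For what it's worth, the cycle budget arithmetic works out like this (a back-of-envelope check using only the numbers already in the post; `cyclesPerBit` is just illustrative):

```cpp
#include <stdint.h>

// An 84 MHz core and a 4 MHz data clock leave this many CPU cycles
// per bit period.
static uint32_t cyclesPerBit(uint32_t coreHz, uint32_t dclkHz) {
  return coreHz / dclkHz;  // 84000000 / 4000000 = 21
}

// A 2.5 us digitalRead() costs 84e6 * 2.5e-6 = 210 cycles, roughly ten
// times the 21-cycle budget, so bit-banging with digitalRead can never
// keep up at the full clock rate regardless of how the loop is unrolled.
```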
The pins are configured as inputs, without pullups, btw.