I am working on a project that uses a 12-bit ADC (Intersil ISL26321) and a 10-bit DAC (Microchip MCP4911), both on the SPI bus. I have written code that talks to each device independently and works just fine, but I have run into a problem when combining the two pieces of software to use both devices in sequence.
I have written subroutines for communicating with each device, but there is some sort of "cross talk" between them that corrupts the data going to and from both devices.
The code as written returns ADC values between 0 and 3 on the serial monitor when the result should be about 1000 (1 V with a 4.096 V reference is 1000 counts). My oscilloscope confirms that no data is being returned from the ADC; the clock and CS both appear correct.
If I comment out the two SPI.transfer() calls in LP_Bias(), the ADC results are correct.
If I leave the SPI.transfer() calls in and instead comment out the call to LP_Bias(i) in the loop, I get results that are about 2.5 times higher than they should be. The oscilloscope confirms that the data line really is returning a larger value than it should; the clock and CS still look correct.
The swept voltage produced by the DAC also appears noisy when the ADC code is included.
I would greatly appreciate it if someone could look at the attached code and give me a hint as to what I am doing wrong.
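To give an idea of the shape of the code: the DAC write in LP_Bias() amounts to a 16-bit MCP4911 command word sent as two SPI.transfer() calls. The sketch below is simplified, not the attached code; the pin number and the BUF/gain/shutdown bits are placeholders based on the MCP4911 datasheet:

```cpp
#include <SPI.h>

const int DAC_CS = 9;  // placeholder pin; the schematic defines the real one

// Minimal MCP4911 write: 16-bit command word, bits 13 and 12 set
// (/GA = 1 for 1x gain, /SHDN = 1 for active), 10-bit code in bits 11..2.
void LP_Bias(uint16_t code)
{
  uint16_t word = 0x3000 | ((code & 0x03FF) << 2);
  digitalWrite(DAC_CS, LOW);
  SPI.transfer(highByte(word));  // first of the two transfers
  SPI.transfer(lowByte(word));   // second of the two transfers
  digitalWrite(DAC_CS, HIGH);    // output latches on the CS rising edge
}
```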
I wanted to use port manipulation for both the DAC and the ADC but was having trouble getting it to work on pin PB1. I will revisit this later.
The schematic is attached. The Arduino's hardware SS pin drives the ADC's CNV pin, and a separate pin drives the DAC's CS pin.
The ADC requires a minimum of 400 nanoseconds to complete an acquisition. Timing isn't too critical, but I have never used this approach before and wanted to try something new. It works well, at least when the ADC code is used by itself.
MarkT:
It is unusual, but if you look at figure 30 in the attached datasheet, you will see that the CNV pin has to be toggled between the acquisition and conversion phases.
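Something along these lines (a simplified sketch, not my exact code; the CNV polarity, the pause, and the two-byte read are my reading of figure 30, and the datasheet is the authority):

```cpp
#include <SPI.h>

const int ADC_CNV = 10;  // hardware SS pin, wired to the ADC's CNV input

// Toggle CNV to separate the acquisition phase from the conversion
// phase, allow the >= 400 ns acquisition time, then clock out the
// 12-bit result over SPI.
uint16_t readADC()
{
  digitalWrite(ADC_CNV, LOW);               // acquisition phase
  delayMicroseconds(1);                     // comfortably above the 400 ns minimum
  digitalWrite(ADC_CNV, HIGH);              // toggle CNV: start the conversion
  uint16_t result = SPI.transfer(0) << 8;   // clock out the high byte
  result |= SPI.transfer(0);                // then the low byte
  digitalWrite(ADC_CNV, LOW);
  return result & 0x0FFF;                   // keep the 12-bit code
}
```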
But why the port manipulation then? Just drop it and use pinMode() and digitalWrite().
And if an acquisition takes 400 ns minimum, why give it only 400 ns? You should wait the maximum time it might take, not the minimum... Or better, isn't there a way for the ADC to let you know when it's ready?
While timing isn't critical, I am interested in seeing how fast I can make this system run. I have tried removing all of the port manipulation and using pinMode() and digitalWrite(), and it didn't solve the problem.
The comment in the code isn't very descriptive: the minimum required by the converter is 400 nanoseconds, but the delay as implemented is closer to 500 nanoseconds. I will set the delay close to the maximum the converter specifies to see if that improves the accuracy of the conversion. Thank you for the suggestion.
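For reference, on a 16 MHz AVR one way to get a pause in that neighborhood is a handful of single-cycle NOPs at 62.5 ns each. This is a guess at what the implemented delay looks like, assuming an Uno-class board, not the actual attached code:

```cpp
// Roughly 500 ns on a 16 MHz AVR: eight single-cycle NOPs at 62.5 ns each.
static inline void acquisitionDelay(void)
{
  __asm__ __volatile__("nop \n nop \n nop \n nop \n nop \n nop \n nop \n nop");
}
```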
The port manipulation was causing the problem. I went back to using digitalWrite() for both chip selects, and everything is working now. I must have made a mistake when I tried that the first time.
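For anyone finding this later, the working arrangement boils down to the pattern below (a sketch reusing the hypothetical LP_Bias() and readADC() routines from above; pin numbers and idle levels are assumptions):

```cpp
#include <SPI.h>

const int ADC_CNV = 10;  // hardware SS pin, wired to the ADC's CNV
const int DAC_CS  = 9;   // assumed pin for the DAC's CS

void setup()
{
  pinMode(ADC_CNV, OUTPUT);
  digitalWrite(ADC_CNV, LOW);   // ADC idles in acquisition
  pinMode(DAC_CS, OUTPUT);
  digitalWrite(DAC_CS, HIGH);   // DAC deselected
  SPI.begin();
  Serial.begin(9600);
}

void loop()
{
  for (uint16_t i = 0; i < 1024; i++) {
    LP_Bias(i);                 // only DAC_CS moves during the write
    uint16_t code = readADC();  // only ADC_CNV moves during the read
    Serial.println(code);
  }
}
```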