Hi everyone,
I am trying to program the ADC161S626 (datasheet: http://www.ti.com/lit/ds/symlink/adc161s626.pdf, timing diagram on p. 7) based on this code: https://www.allaboutcircuits.com/projects/arduino-spi-library-ltc1286-dac714/ (example 2).
Communication is over SPI with an Arduino Mega.
My code is the following:
#include <SPI.h>
const int spi_ss = 48; // set SPI SS Pin
uint8_t byte_0, byte_1, byte_2; // first, second and third bytes read
uint16_t spi_bytes; // final 15-bit shifted value
float v_out; // decimal voltage
float vref = 2.5; // voltage on Vref pin
void setup() {
// put your setup code here, to run once:
Serial.begin(9600); // begin serial and set speed
pinMode(spi_ss, OUTPUT); // Set SPI slave select pin as output
digitalWrite(spi_ss, HIGH); // Make sure spi_ss is held high
SPI.begin(); // begin SPI
}
void loop() {
// put your main code here, to run repeatedly:
SPI.beginTransaction(SPISettings(2000000, MSBFIRST, SPI_MODE3));
// 2 MHz clock, MSB first, mode 3: set speed, bit order and clock/data polarity while starting the SPI transaction
digitalWrite(spi_ss, LOW);
// pull the ADC chip-select low to initiate a sample and data transmit
byte_0 = SPI.transfer(0); // read first 8 bits
byte_1 = SPI.transfer(0); // read second 8 bits
byte_2 = SPI.transfer(0); // read third 8 bits
digitalWrite(spi_ss, HIGH);
// write the CS pin high to stop the ADC from transmitting.
SPI.endTransaction();
// close SPI transaction
spi_bytes = ( ( (byte_0 & B00111111) <<10) + (byte_1 <<2) + ( (byte_2 & B10000000) >>6));
// & B00 inital 1 bit offset + one for null bit by & and shift into spi_bytes
// then we add the remaining byte and shift right to remove bit 12 = à refaire pour expliquer
v_out = vref * (float(spi_bytes) / 32767.0);
// finally we recover the value in volts; 1 LSB = vref/32767
// 15-bit ADC: 2^15 - 1 = 32767
Serial.println(v_out, 3);
delay(250);
// Delay that is fast but easy to read.
// delayMicroseconds(83);
// Delay that matches a 12 kHz sample rate.
}
But it does not work: with a 4.7 V analog input, I read back about 3 V.
Currently I read 3 bytes, but looking at the timing diagram I can see that is wrong; I have to read 18 bits.
I think that I have to do:
- pull CS low
- skip the first 2 bits (not read)
- read 2 bytes (the 16 data bits)
- pull CS high
The problem is that I do not know how to do this...
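My best guess so far is something like the snippet below, reusing the same variables as above (completely untested; it assumes the frame is 2 leading bits followed by the 16 data bits MSB first, and that the result is 16-bit two's complement, which is how I read the datasheet):

digitalWrite(spi_ss, LOW);            // start the frame
uint8_t b0 = SPI.transfer(0);         // 2 leading bits + data bits 15..10
uint8_t b1 = SPI.transfer(0);         // data bits 9..2
uint8_t b2 = SPI.transfer(0);         // data bits 1..0, then 6 padding clocks
digitalWrite(spi_ss, HIGH);           // end the frame

uint32_t raw = ((uint32_t)b0 << 16) | ((uint32_t)b1 << 8) | b2;
int16_t code = (int16_t)((raw >> 6) & 0xFFFF);  // drop the 6 trailing bits, keep the 16 data bits
float volts = vref * (float)code / 32768.0;     // 1 LSB = vref / 2^15 if the output is two's complement

Is that the right way to skip the 2 leading bits, or is there a way to clock out exactly 18 bits with SPI.transfer?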
If someone can help me, that would be great.
Laurent