LTC1664 10 bit DAC code

Guys,

I’m writing some code to control an LTC1664 10-bit DAC. I have the PCBs but don’t have the DAC yet, so I can’t test it out, and I’m just not convinced I’m getting my head round the bitwise operations today - maybe old age is catching up!

Basically, you control the output of the DAC by sending 16 bits over the SPI interface. The first bit is a zero, followed by 3 bits for an address (channels 1-4, NOT 0-3 as I would have imagined), then the 10-bit value (0-1023), and finally two more zeros.

Okay, so I have two bytes.

byte1 is the address shifted 4 places left, plus the most significant 4 bits of the value (obtained by shifting it 6 places right)

byte 2 (byte0 in the code) is the value ANDed with 0x3F (binary 111111) so I only keep the 6 least significant bits, then shifted two places left

The DAC is controlled by setting CS low (it is on A13 on my board), SPI.transferring both bytes, then setting CS back high.

My code simply steps the DAC through each value from 0-1023 at 10 ms intervals. This is it …

#include <SPI.h> 

const int csPin = A13;

void setup() 
{
    SPI.begin();
    SPI.setBitOrder(MSBFIRST);
    pinMode(csPin,OUTPUT);
    digitalWrite(csPin, HIGH);
}

void loop() 
{
    for(int i=0; i<=1023; i++) // <= so the full-scale value 1023 is included
    { 
        d2aWrite(1,i);
        delay(10);
    } 
}

void d2aWrite(int address, int value) 
{
    byte byte1 = (address << 4) | (value >> 6); // 0 A2 A1 A0 D9 D8 D7 D6
    byte byte0 = (value & 0x3F) << 2;           // D5..D0 then the two trailing zeros (0x3F = B00111111)
    digitalWrite(csPin, LOW); //select slave
    SPI.transfer(byte1);
    SPI.transfer(byte0);
    digitalWrite(csPin, HIGH); //de-select slave
}
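An alternative way to see the frame, in case it helps anyone checking the layout: build the whole 16-bit input word in one go and split it afterwards. This is just a sketch in plain C (the `ltc1664_word` helper is mine), assuming the bit layout described above:

```c
#include <stdint.h>

/* Build the full 16-bit LTC1664 input word:
   0 | A2 A1 A0 | D9..D0 | 0 0
   address is 1-4, value is 0-1023. */
static uint16_t ltc1664_word(int address, int value)
{
    return (uint16_t)(((address & 0x7) << 12) | ((value & 0x3FF) << 2));
}
```

On the Arduino side the two bytes would then be `word >> 8` and `word & 0xFF`, or the word could be sent in one call with SPI.transfer16() on cores that provide it.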

Can you see any obvious errors or have I actually got it right?

Thanks for casting fresh eyes over it. Steve.