
Topic: Due SPI question

BrotherTheo

Hi,

I am using SPI to output 16 bits to a DAC. While trying to squeeze the maximum speed out of the Due, I noticed there is a gap between the 8-bit bursts coming out of the SPI. The delay between bursts is about 1.2 µs. Is there a way to reduce this delay?

Here is the code:
Code:
#include <SPI.h>

// DAC chip-select pin
#define DAC_CS 52

void setup() {
  SPI.begin(DAC_CS);
  SPI.setBitOrder(DAC_CS, MSBFIRST);
  SPI.setDataMode(DAC_CS, 0);
  SPI.setClockDivider(DAC_CS, 21); // 84 MHz / 21 = 4 MHz
}

void loop() {
  // put your main code here, to run repeatedly:
  WriteDacChannel(0,0x0555);
  delay(1);
}

// Write value to the DAC. chan = 1 to 8, val = 0 to 4095
void WriteDacChannel(uint8_t chan, uint16_t val)
{
  // Channel bits in the upper nibble; the exact frame format depends on the DAC
  uint16_t thisVal = (uint16_t)chan << 12;

  thisVal |= val & 0x0FFF;     // trim val to 12 bits and OR in
  SPI.transfer(DAC_CS, thisVal >> 8, SPI_CONTINUE);
  SPI.transfer(DAC_CS, thisVal & 0xFF, SPI_LAST);
}
}

dlloyd

Yes, there is probably more than one way, but the SPI library doesn't seem to have the necessary options. I tried SPI.transfer16() but couldn't get it to work. Setting the Chip Select Register's bits-per-transfer field to 16 does work, but only when CS is pin 10. Writing your own SPI functions instead of using the SPI library would work and give full flexibility, but here's a working example that uses the SPI library for preliminary configuration only.

Code:
#include <SPI.h>

// DAC chip-select pin
#define DAC_CS 10

uint16_t thisVal = 0x5555;

void setup() {
  SPI.begin(DAC_CS);
  SPI.setBitOrder(DAC_CS, MSBFIRST);
  SPI.setDataMode(DAC_CS, 0);
  SPI.setClockDivider(DAC_CS, 21); // 84 MHz / 21 = 4 MHz
  REG_SPI0_CSR |= 0x80;            // BITS field of CSR0 = 8 -> 16 bits per transfer
}

void loop() {
  if ((REG_SPI0_SR & 2) != 0) REG_SPI0_TDR = thisVal; // TDRE set: transmit register empty, load the next 16-bit frame
  delay(1);
}
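For completeness, here's a rough sketch of how the 16-bit write could be wrapped in a helper like the original WriteDacChannel(). The channel-in-the-upper-nibble packing is only an assumption about the DAC's frame format, so check the datasheet and adjust as needed.

Code:
// Sketch only: pack channel and value into one 16-bit frame and write it directly
// to the transmit register. Assumes the DAC wants the channel in the upper nibble
// and a 12-bit value in the lower bits - adjust for your actual device.
void WriteDacChannel16(uint8_t chan, uint16_t val)
{
  uint16_t frame = ((uint16_t)(chan & 0x7) << 12) | (val & 0x0FFF);
  while ((REG_SPI0_SR & 2) == 0) ;   // wait for TDRE (transmit data register empty)
  REG_SPI0_TDR = frame;              // one 16-bit transfer, no gap in the middle
}

If I recall correctly, the Due SPI library configures the controller for variable peripheral select, so a bare TDR write like this leaves the PCS bits at zero and drives NPCS0, which is pin 10 on the Due. That would explain why the 16-bit trick only works when CS is 10.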
