I am having an issue with an Arduino ATmega168 circuit interfaced to a TLV5638 two-channel 12-bit DAC.
Issues:
- On an oscilloscope, the output always has a 105 kHz, 200 mVpp square wave riding on it. The amplitude of this square wave seems proportional to the output voltage: e.g. with the output set to the 2.048 V maximum, the square wave riding on that DC level is much larger.
- For the most part the separate output voltages (output A and output B on the DAC) are stable, but certain combinations of values seem to produce a jittery output on both channels.
- For the SPI transfer to work in my code, each frame has to be sent to the DAC twice (I know other people have documented having to do this).
So do I need some kind of filter on the output to get rid of the 105 kHz square wave?
Any ideas what might cause that strange jitter problem?