I am very new to Arduino programming, so there are many things I do not know or understand.
I am trying to program the I2C bus of a Linear Technology LTC2305, a 12-bit ADC. This chip has two multiplexed inputs. I am using Channel 0.
Here is the datasheet: http://cds.linear.com/docs/en/datasheet/23015fb.pdf
I am using a Linear Technology demo board and a Freetronics EtherTen.
I have had this board working using a PICAXE, but now want to use the Arduino.
I am using the Serial Monitor to display the two bytes being read.
I have attached a snip of my simple code to read one channel (file: 140917_I2C code.jpg).
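In case the image is hard to read, the code is essentially along these lines. This is only a minimal sketch of what I am doing: the 7-bit address 0x08 is my assumption from the AD0/AD1 jumper settings on the demo board, and the channel-select byte 0x88 is my reading of the Din word table in the datasheet (single-ended, Channel 0, unipolar).

```cpp
#include <Wire.h>

// Assumed 7-bit I2C address -- set by the AD0/AD1 jumpers on the demo board.
// Check the address assignment table in the LTC2305 datasheet for your setting.
const byte LTC2305_ADDR = 0x08;

// Din word for single-ended Channel 0, unipolar mode (my reading of the datasheet):
// S/D=1, O/S=0, X, X, UNI=1, SLP=0, X, X  ->  0b10001000
const byte CH0_UNIPOLAR = 0x88;

void setup() {
  Serial.begin(9600);
  Wire.begin();
}

void loop() {
  // Send the channel/mode word, then read back the 16-bit conversion result.
  Wire.beginTransmission(LTC2305_ADDR);
  Wire.write(CH0_UNIPOLAR);
  Wire.endTransmission();

  Wire.requestFrom(LTC2305_ADDR, (byte)2);
  if (Wire.available() >= 2) {
    byte msb = Wire.read();
    byte lsb = Wire.read();

    // The 12-bit result is left-justified in the two bytes, MSB first.
    unsigned int counts = ((unsigned int)msb << 8 | lsb) >> 4;

    Serial.print("Bytes: ");
    Serial.print(msb, HEX);
    Serial.print(" ");
    Serial.print(lsb, HEX);
    Serial.print("  Counts: ");
    Serial.println(counts);
  }
  delay(500);
}
```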
As a test input, I am using the Vref test point on the board, which is 2.500 volts. The full scale of the ADC is 4096 counts, so 1 count = 1 mV. On the PICAXE, the returned hex values were 93 for the first byte and 03 for the second byte, which worked out to 2499 decimal.
Here also is a snip of a few lines of the Serial Monitor (file: 140917_I2C Serial Monitor sample).
The problems I have are that the returned values are not constant and, as far as I can see, bear no relationship to the correct value. I get the same kind of results if I connect the input to ground, or if I put Vref into Channel 1 and change the channel-select value accordingly (see the snippet below).
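The only change I make for Channel 1 is the Din word (again, this is my reading of the datasheet's Din format, so it may be wrong):

```cpp
// Single-ended Channel 1, unipolar: S/D=1, O/S=1, UNI=1, SLP=0 -> 0b11001000
const byte CH1_UNIPOLAR = 0xC8;
```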
I have attached two 4K7 pull-up resistors to the SDA and SCL lines, with no effect. SDA goes to Analog 4 and SCL to Analog 5. Initially, the USB-supplied 5 volts measured 4.8 volts at the demo board (within the 4.75-5.25 V range specified), but I changed to a separate DC power supply for the Arduino, which gives 5.0 volts at the LTC2305. No change.
I suspect my poor programming, but I cannot recognise what I am doing wrong. Any suggestions would be welcome. Thank you.