Hello All
I was wondering if anyone could help me with an issue where I am sure I am missing something fundamental.
I have two Arduinos connected to each other via a UART. The master has a potentiometer connected to its ADC input that varies the voltage between 0 and 3.3V. The ADC reading (0 - 1023, since analogRead() is left at its default 10-bit resolution) is sent to a slave via the UART, which I was hoping would produce an output on its DAC ranging from 0.55V to 2.75V. The master and slave code can be seen below.
Master
void setup()
{
  Serial.begin(115200);               // initialize serial communication at 115200 bits per second
}

void loop()                           // the loop routine runs over and over again forever
{
  int sensorValue = analogRead(A0);   // read the input on analog pin 0
  Serial.println(sensorValue);        // send the reading to the slave over the UART
  delay(30);                          // delay in between reads for stability
}
Slave
void setup()
{
  Serial.begin(115200);               // same baud rate as the master
  analogWriteResolution(12);          // use the full 12-bit range of the DAC
  pinMode(DAC0, OUTPUT);
}

void loop()
{
  if (Serial.available() > 0)
  {
    unsigned int val = Serial.parseInt();         // read the value sent by the master
    unsigned int ScaleVal = val * (2.2 / 4095);   // attempt to scale it for the DAC
    Serial.println(val);                          // echo the received value
    analogWrite(DAC0, ScaleVal);                  // write the scaled value to the DAC
  }
}
When this code is running, the slave's serial monitor shows 0 for 0V on the master and 1023 for 3.3V on the master, as I would expect. However, the slave's DAC output just sits at 0.55V.
I am trying to make it so that a linear increase from 0V to 3.3V on the master produces a linear increase from 0.55V to 2.75V on the slave's DAC, as sketched below.
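Just to spell out the relationship I am after (this is purely the maths, not part of my sketch, and assumes the DAC pin really can swing from 0.55V to 2.75V):

// Hypothetical helper, only to illustrate the mapping I want.
// vIn is the voltage read by the master (0 - 3.3V); the return value is the
// voltage I want to see on the slave's DAC pin.
float targetOutputVoltage(float vIn)
{
  // straight-line map: 0V -> 0.55V, 1.65V -> 1.65V, 3.3V -> 2.75V
  return 0.55 + (vIn / 3.3) * (2.75 - 0.55);
}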
I am a bit stuck. Can anyone help?