PressureMax is not correct; that is simply the highest value the ADC can report.
I suggest using two-point calibration, with (say) 20 PSI and 100 PSI as the calibration points.
The 1023 value suggests that your setup is exceeding the allowable input voltage on the ADC pin, which can fry the input circuitry. It is a good idea to put a 10K resistor in series with the ADC input to protect it from voltages greater than AREF (usually the Arduino Vcc).
Sorry, I didn't know you wanted the code; please see below.
// PRESSURE SENSOR SETUP
float pressureValue1 = 0;  // converted PSI value from the pressure transducer on A1
float pressureValueA1 = 0; // raw ADC reading from the pressure transducer on A1
const int pressureZero = 200;  // analog reading of pressure transducer at 0 psi
const int pressureMax = 1023;  // analog reading of pressure transducer at full-scale pressure
const int pressuretransducermaxPSI = 200; // psi rating of the transducer being used

void setup() {
  Serial.begin(9600);
}

void loop() {
  delay(20000); // delay in between reads for stability
  pressureValueA1 = analogRead(A1); // read raw value from input pin A1
  pressureValue1 = ((pressureValueA1 - pressureZero) * pressuretransducermaxPSI) / (pressureMax - pressureZero); // convert analog reading to psi
  Serial.println("----");
  Serial.print("PSI1 = ");
  Serial.println(pressureValue1);
  Serial.print("ADC1 = ");
  Serial.println(pressureValueA1);
}
What jremington is suggesting is that your input is saturating at 1023, since that is the highest analog conversion value.
Typically, 4 - 20mA inputs will read from 0 - 22mA. This leaves a small buffer to allow a reading under range and over range. Since you are converting the 4 - 20mA to 1 - 5V, the Mega input cannot read more than 5V.
Theoretically, your setup should work but it appears you have some inaccuracy at the max end. Your math is fine. Are you able to apply 200PSI? Are you able to adjust to known values?
I would verify the documentation on the pressure transducer to make sure what the max (or 20mA, or 5V) value should be. Next, I would see if the sensor has some adjustment for output to correct your inaccuracy.
Calibrating would simply mean changing the max pressure (scaled) value, or the min if you find the inaccuracy at the lower end. To be clear:
I would also verify the pressure transducer's min output. I rarely see transducers that read to zero. Typically a 200 PSI transducer will start reading around 10 - 15 PSI.
Also note that if you change your scaled min to anything other than zero, your formula will not work any more since it is not factoring in any offset. You could use the map() function.
You can but then your values would be off again. You should study and understand the plot he posted. His formula adjusted the 'straight line' in your formula to match what you already have.
If you want to fix what you already have then you should verify:
4mA = 1V = 0 PSI and
20mA = 5V = 200 PSI.
Then no adjustment would be necessary. Although, jremington's solution is perfectly acceptable IMO.
That won't help. As you say, the pressure sensor is not that accurate. The procedure I used gives the best linear fit over the range of the calibration points.
Here is a slightly better fit, using more data. Replace the two calibration constants (scale and offset) in the code with those from the equation in red on the graph.
The best-fit linear model of the data points towards the model jremington posted, pressure = 0.237 * ADC - 44.759, and also shows that those end points don't quite fit the line through the rest of the points.
Looking at the model, I think any pressures above 198 PSI will read as 1023 on the ADC, and 0 PSI will read as about 188.