I have a three-wire 5 V eBay pressure transducer rated to 1.2 MPa (174 psi) and am looking to verify that the code I have is correct. The readout on the display seems reasonable, but I am unsure whether this is the best way to program this sort of input.
The objective is to have a readout in psi. I have done this by scaling an analog input, since the transducer puts out a voltage based on its reading.
At ambient pressure it currently displays ~17 psi. I'm unsure whether this is because the transducer reads the ambient (atmospheric) pressure, or whether it can be zeroed relative to ambient.
I think in your case, if the sensor delivers 0.5 volts to 4.5 volts to represent 0 to 175 psi, and if Vcc = 5 volts, then outputP1 = (inputP1 - 102) * 175 / 818 (if my school-day maths has not let me down too badly).
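For what it's worth, a minimal sketch applying that formula might look something like this (the A0 input, Serial printout, and a standard 10-bit ADC are my assumptions, since your code and wiring aren't shown):

```cpp
// Minimal sketch: read the transducer and apply the scaling above.
// A0 and Serial output are assumptions; your display code would go
// where the Serial prints are.

const int SENSOR_PIN = A0;   // analog pin the transducer signal is assumed to be on

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(SENSOR_PIN);            // 0..1023 for 0..5 V on a 10-bit ADC
  long psi = ((long)raw - 102) * 175 / 818;    // 0.5 V -> 0 psi, 4.5 V -> ~175 psi
  Serial.print(psi);
  Serial.println(" psi");
  delay(500);
}
```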
aarg:
Well, it's better to calibrate a sensor than to assume, anyway.
I'm looking to set the sensor up in a test rig, once I get the code sorted out, to see how accurate it really is. I have two of these and there's already a variance of 2 psi between them at ambient pressure.
I'm glad it seems to give a reasonable answer.
The problem is exactly like converting Fahrenheit to centigrade for water temperature.
The Fahrenheit scale goes from 32 to 212 and centigrade goes from 0 to 100, so the formula is C = (F - 32) * (5/9).
The (5/9) comes from 100/(212 - 32).
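As a quick check, F = 212 gives C = (212 - 32) * (5/9) = 180 * 5/9 = 100, and F = 32 gives 0, so both fixed points land where they should.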
In your case, the analog reading goes from 0 to 1023 for a 0 to 5 volt input on a 5 volt supply. Your sensor delivers 0.5 volts at 0 PSI and 4.5 volts at 175 PSI according to the data sheet.
0.5 volts is an analog reading of 102 (rounded) for 0 PSI
4.5 volts is an analog reading of 921 (rounded) for 175 PSI
so the formula is Pressure (PSI) = (Analog Reading - 102) * 175 / (921 - 102)
and that 921 - 102 = 819 is, allowing for rounding errors, where my 818 came from.
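If it helps, here is one way that could be put into code, keeping the two endpoints as named constants so they can later be replaced with calibrated readings from your test rig (the pin choice and names are mine, just a sketch):

```cpp
// Pressure conversion with the nominal endpoints kept as constants,
// so 102 / 921 can be swapped for values measured during calibration.
// SENSOR_PIN (A0) is an assumption.

const int   SENSOR_PIN     = A0;
const float RAW_AT_ZERO    = 102.0;   // ADC reading at 0 PSI   (0.5 V nominal)
const float RAW_AT_FULL    = 921.0;   // ADC reading at 175 PSI (4.5 V nominal)
const float FULL_SCALE_PSI = 175.0;

float readPsi() {
  int raw = analogRead(SENSOR_PIN);
  return (raw - RAW_AT_ZERO) * FULL_SCALE_PSI / (RAW_AT_FULL - RAW_AT_ZERO);
}
```

Float maths avoids the truncation you get with integer division, and once the rig is running the two RAW_AT_* constants can simply be replaced by the readings you actually measure at 0 and full-scale pressure.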