Error calculation

Hello guys,

I have to design a telemetry device and I have to attach the datasheet. I have some doubts about the total error for a sensor. For example, I have a gauge pressure sensor with accuracy = 0.25%, but there are other possible errors, such as temperature, offset, full scale span... so the error for this sensor is 0.25% + delta. There is also the error from the analog input: for 10 bits, the error is approximately 0.1%, if I am not wrong. So, specifically for that measurement, what is the total error? I thought about adding every kind of error together, but now I think that's not the right way. Thanks in advance.
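
Just to show where my 0.1% figure comes from, here is a quick sketch, assuming the resolution error of an ideal ADC is about 1 LSB of full scale (some count it as +/- 0.5 LSB instead):

```python
# Rough resolution error of an ideal N-bit ADC, as a fraction of full scale.
# Assumes the error is about 1 LSB.

def adc_resolution_error_percent(bits: int) -> float:
    """Return 1 LSB of an ideal `bits`-bit ADC as a percentage of full scale."""
    return 100.0 / (2 ** bits)

print(adc_resolution_error_percent(10))  # 10 bits -> ~0.098%, i.e. roughly 0.1%
```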

The best way, and really the only useful way, to determine sensor error is to calibrate the sensor against laboratory grade instruments, under a variety of test conditions.

Laboratory grade instruments also have to be calibrated, which is partly why they are so expensive.

There may be noise on the analog signal, adding to the error.
There are also errors from the calculations themselves (rounding errors), and plenty of others.
Calculating an error range is not easy, and indeed the best way is to do a number of measurements of well-known inputs (pressures, in your case) and see how accurate your actual readings are.
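
For example, a minimal sketch of what that comparison might look like (the reference pressures, readings, and full-scale value below are made up, just to show the idea):

```python
# Compare readings from the device under test against known reference pressures.
# All values are hypothetical and in the same units (e.g. kPa).
reference = [10.0, 50.0, 100.0, 200.0, 400.0]   # applied by a calibrated source
measured  = [10.2, 49.8, 100.5, 199.1, 401.6]   # what the telemetry device reported

full_scale = 400.0  # assumed full-scale range of the sensor

# Error of each point, expressed as % of full scale.
errors_pct_fs = [100.0 * (m - r) / full_scale for m, r in zip(measured, reference)]

worst = max(abs(e) for e in errors_pct_fs)
mean = sum(errors_pct_fs) / len(errors_pct_fs)

print(f"errors (% FS): {[round(e, 3) for e in errors_pct_fs]}")
print(f"worst-case error: {worst:.3f} % FS, mean error: {mean:.3f} % FS")
```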

Have a google of "dealing with transducer errors" (sometimes now called "uncertainty").

See if you can find out whether the 0.25% is the total error band, just linearity, etc.

A method I have used is to take the RMS of all the individual errors (the square root of the sum of their squares) to get the final measurement error.
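
Something like this, as a rough sketch; the 0.25% and 0.1% terms are the numbers from your post, and the temperature term is made up just for illustration:

```python
import math

# Individual error contributions, each as % of full scale.
# 0.25% is the sensor accuracy from the datasheet, 0.1% is the 10-bit ADC
# figure from the first post; the 0.15% temperature term is hypothetical.
errors_pct = {
    "sensor accuracy": 0.25,
    "adc resolution": 0.10,
    "temperature effect": 0.15,
}

# Root-sum-square combination: sqrt(e1^2 + e2^2 + ...)
total_rss = math.sqrt(sum(e ** 2 for e in errors_pct.values()))

# Straight sum, for comparison (the pessimistic worst case).
total_sum = sum(errors_pct.values())

print(f"RSS total:  {total_rss:.3f} % FS")   # ~0.308 % FS
print(f"Worst case: {total_sum:.3f} % FS")   # 0.500 % FS
```

The RSS number assumes the individual errors are independent; if they are not, the straight sum is the safer bound.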

Thanks for your replies, guys!