I'm using an MS5611 pressure sensor (datasheet), and its datasheet discusses a total error band as well as a Resolution RMS.
I'm fairly confident I don't understand what it means by Resolution RMS, beyond RMS standing for root mean square. The datasheet lists an RMS value of 0.012 mbar at the highest resolution, but it also states that the error band at that resolution is ±0.5 mbar, which seems a fair bit larger than 0.012 mbar.
I would like to use this RMS value to create a deadband that (theoretically) throws out bad reads caused by either random fluctuations or rounding errors (see the sketch below), but my web searches so far haven't produced any useful information.
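To make the question concrete, here's a rough sketch of the kind of deadband I have in mind. The 3× multiplier is just a guess on my part, and `deadband_filter` is a placeholder name, not anything from the MS5611 libraries:

```c
#include <math.h>

#define RESOLUTION_RMS_MBAR 0.012f  /* resolution RMS at the highest resolution, per the datasheet */
#define DEADBAND_K          3.0f    /* threshold multiplier -- just a guess on my part */

/* Ignore any new reading that differs from the last accepted value by less
 * than DEADBAND_K * resolution RMS; treat such small changes as jitter. */
float deadband_filter(float raw_mbar)
{
    static float last_accepted = NAN;

    if (isnan(last_accepted) ||
        fabsf(raw_mbar - last_accepted) >= DEADBAND_K * RESOLUTION_RMS_MBAR) {
        last_accepted = raw_mbar;  /* change is large enough to trust */
    }
    return last_accepted;
}
```

The idea would be to call this on every raw conversion and only act on the returned value, but I don't know whether scaling the threshold off the RMS figure like this is actually the right way to use that spec.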
I know there are other options, such as weighted filtering, averaging over a time span, or a Kalman filter, but I'm specifically interested in learning more about RMS so I can either tailor-make a filter or at least better understand my sensor.
Can anyone point me in the right direction for understanding RMS in relation to a sensor, or for how I might use it to filter out jitter?