Alright, I'm officially stumped. Everything I've found on Google about other weather station protocols has many more bits than the signal my thermo/hygro sensor gives me.
At the beginning of every message, no matter what the temperature or humidity reads, I get the same bits 101010110101, which, as I mentioned earlier, I'm guessing is some sort of device ID. But that only leaves 14 bits in the stream to hold a sign bit, a temperature (to one decimal place), and a humidity value.
For instance, the reading I just captured, 71.2F and 39% humidity, was: 101010110101 00101100111001
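To keep the captures straight, here's a tiny sketch that splits a frame on that working assumption; the 12-bit ID prefix and the split point are my guesses, not anything documented:

```python
# Split a frame on my working assumption: a fixed 12-bit ID followed
# by a 14-bit payload. The split point is a guess, nothing documented.
capture = "10101011010100101100111001"  # 71.2F / 39%

header, payload = capture[:12], capture[12:]
assert header == "101010110101"  # the prefix every frame seems to share

print("payload bits :", payload)
print("as integer   :", int(payload, 2))
print("as nibbles   :", [payload[i:i + 4] for i in range(0, len(payload), 4)])
```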
Here are a couple more, closer together:
70.2F 41%
10101011010100000010100101
70.0F 41%
10101011010100101100011101
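Since those two readings share the same humidity, a quick XOR of the two payloads should show which bit positions move with temperature, assuming a plain positional encoding (which is exactly what's in doubt):

```python
# XOR the two payloads that share 41% humidity; with a plain positional
# encoding, only temperature bits should differ between these frames.
a = "10101011010100000010100101"[12:]  # 70.2F / 41%
b = "10101011010100101100011101"[12:]  # 70.0F / 41%

diff = int(a, 2) ^ int(b, 2)
changed = [i for i in range(14) if (diff >> (13 - i)) & 1]
print("payload a:", a)
print("payload b:", b)
print("bit positions (MSB=0) that differ:", changed)
```

If the payload were a simple binary counter, a 0.2F step should only flip a bit or two; lots of changed positions would hint at some kind of scrambling, a checksum mixed in, or mis-sampled bits on my end.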
That seems to rule out BCD, which would need at least 4 bits per digit: four temperature digits plus two humidity digits would take 24 bits, far more than the 14 I have. Even plain binary looks too tight, since temperature in tenths of a degree needs 10-11 bits and humidity (0-100%) needs 7, for roughly 17-18 bits total.
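In case a plain binary field is hiding at some odd offset anyway, here's a brute-force sketch that slides a window across each payload and flags any window whose value matches a known reading; the window widths and the candidate targets (humidity, tenths of F, tenths of C) are just my guesses:

```python
# Brute-force field search: slide a window across each payload and flag
# any window whose plain-binary value matches a known reading. The
# window widths and candidate targets are guesses on my part.
captures = [
    ("00101100111001", 71.2, 39),
    ("00000010100101", 70.2, 41),
    ("00101100011101", 70.0, 41),
]

for payload, temp_f, hum in captures:
    temp_c = (temp_f - 32) / 1.8
    targets = {
        "humidity": hum,
        "tenths F": round(temp_f * 10),
        "tenths C": round(temp_c * 10),
    }
    for name, value in targets.items():
        for width in range(4, len(payload) + 1):
            for start in range(len(payload) - width + 1):
                if int(payload[start:start + width], 2) == value:
                    print(f"{payload}: {name}={value} fits bits "
                          f"{start}..{start + width - 1}")
```

Any hit across all three captures at the same offset would be interesting; a hit in only one capture is probably coincidence.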
Are there any other common encoding schemes I should look into? Maybe the bitstream is somehow compressed, but I have no idea how that would work in so few bits. Any pointers in the right direction, even just a hunch, would be helpful!
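One thing I plan to check myself is whether the capture is really the logical bitstream at all, or a line coding like Manchester, where each logical bit goes out as a pair (10 for 1, 01 for 0). Here's a quick sanity check at both pair alignments; the decoder and its phase handling are my own sketch:

```python
# Test whether the raw capture might be Manchester-coded: each logical
# bit sent as a pair, 10 -> 1 and 01 -> 0. A 00 or 11 pair at a given
# alignment is illegal, which would argue against that alignment.
def try_manchester(raw, phase):
    pairs = [raw[i:i + 2] for i in range(phase, len(raw) - 1, 2)]
    # '?' marks an illegal pair
    return "".join("1" if p == "10" else "0" if p == "01" else "?"
                   for p in pairs)

raw = "10101011010100101100111001"  # 71.2F / 39%
print("phase 0:", try_manchester(raw, 0))
print("phase 1:", try_manchester(raw, 1))
```

If both phases spit out lots of illegal 00/11 pairs, Manchester is probably out too, and I'd start suspecting my sampling rather than the encoding.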