I have a humidity sensor (Honeywell HCH-1000), hooked up as described here. In short: the capacitor (the sensor itself) is first discharged, then charged until it reaches a certain percentage of full charge (5 V). Once the charge times for 0% and 100% RH are known, the relative humidity can be calculated from the time needed to charge. The state of charge is monitored through one of the analog pins.
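Roughly, the measurement loop looks like this (pin numbers are placeholders for my actual wiring; 1016 counts is just 99.3% of the 10-bit full scale of 1023):

```cpp
const int chargePin = 8;     // drives the sensor through the charging resistor
const int sensePin  = A0;    // reads the voltage across the sensor capacitor
const int threshold = 1016;  // trip point: ~99.3% of the 10-bit full scale (1023)

unsigned long measureChargeTime() {
  // discharge the sensor first
  digitalWrite(chargePin, LOW);
  delay(10);                         // give it time to discharge fully

  // start charging and time how long it takes to reach the trip point
  unsigned long start = micros();
  digitalWrite(chargePin, HIGH);
  while (analogRead(sensePin) < threshold) {
    // busy-wait until the reading crosses the threshold
  }
  return micros() - start;
}

void setup() {
  pinMode(chargePin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  Serial.println(measureChargeTime());
  delay(1000);
}
```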
Here is my problem: besides the humidity sensor, I also have a few temperature sensors (MCP9700-E/TO) on the analog inputs, and I've increased their accuracy by feeding the 3.3V power supply to AREF. Only after a while did I realise that re-scaling the analog inputs from 0-5V to 0-3.3V also affects the readings from the humidity sensor: if the sketch is set to wait until the capacitor reaches 99.3% of full charge, it now trips when the cap charges to 99.3% of 3.3V, not 99.3% of 5V.
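Just to spell out what the re-scaling does to the trip point (this assumes a 10-bit ADC and the same ~1016-count threshold as above):

```cpp
// The ADC reports a fraction of whatever voltage is on AREF, so the same
// count threshold corresponds to a different trip voltage once AREF changes:
const int   thresholdCounts = 1016;
const float tripAt5V  = thresholdCounts / 1023.0 * 5.0;  // ≈ 4.97 V (default reference)
const float tripAt3V3 = thresholdCounts / 1023.0 * 3.3;  // ≈ 3.28 V (3.3 V on AREF)
```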
To give some numbers: with 3.3V fed to AREF, the charge time to 99.3% in my set-up is 4302 microseconds for 0% RH and 4900 microseconds for 100% RH, and 700 microseconds for the open circuit (with the sensor itself removed). The first time I calibrated, without anything on AREF, I got 12257 microseconds (0% RH) and 27417 microseconds (100% RH).
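For what it's worth, I map the measured time onto RH roughly like this (simplified to a straight linear interpolation between the two calibration points; numbers from the 3.3V calibration above):

```cpp
const long timeAt0RH   = 4302;  // charge time in microseconds at 0% RH (3.3 V on AREF)
const long timeAt100RH = 4900;  // charge time in microseconds at 100% RH

float chargeTimeToRH(unsigned long chargeTime) {
  // linear map: timeAt0RH -> 0 %, timeAt100RH -> 100 %
  return ((long)chargeTime - timeAt0RH) * 100.0 / (timeAt100RH - timeAt0RH);
}
```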
My questions are: Should I be worried about the humidity sensor charging only to 3.3V instead of 5V? What effect could this have on the sensor's accuracy? And is it possible to have the temperature sensor inputs scaled to the lower AREF voltage while the humidity input stays referenced to 5V?
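Regarding the last question, what I have in mind is something like switching the reference between readings, though I don't know whether that is safe or accurate; the Arduino reference warns against leaving an external voltage connected directly to AREF while the internal reference is selected, unless it goes through a resistor (around 5k). A rough sketch of the idea, reusing measureChargeTime() from the first snippet:

```cpp
int readTemperatureRaw(int pin) {
  analogReference(EXTERNAL);   // 3.3 V on AREF for the temperature sensors
  analogRead(pin);             // discard one reading while the reference settles
  return analogRead(pin);
}

unsigned long readHumidityChargeTime() {
  analogReference(DEFAULT);    // back to the 5 V supply as reference
  analogRead(sensePin);        // discard one reading while the reference settles
  return measureChargeTime();  // the timing loop from the first snippet
}
```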