Lately I've been reading about how one has to use a resistor on the data line of a DHT22 sensor when using an Arduino (since my understanding is that the ATmega is being powered with less than 5 V). This is detailed in the Adafruit tutorial (http://learn.adafruit.com/dht/connecting-to-a-dhtxx-sensor). I understand that due to the different voltage references of the chip and the sensor, one needs to use a resistor on the signal line (as also explained at http://jeelabs.org/2010/12/16/voltage-3-3-vs-5/).
However, I am a bit confused about all this, and so I was hoping to ask a few questions to see if I can understand a bit better:
Why does the Adafruit example use a 10 kΩ resistor, while the JeeLabs post talks about a 1 kΩ resistor? In a recent forum discussion (http://forum.arduino.cc/index.php?topic=107333.0), there was mention of using a DHT22 with resistors anywhere in the range of 3.3 kΩ to 10 kΩ. How are these values determined?
If I were to use an Arduino Pro Mini 3.3 V: a. which resistor value would I need to get reliable results from a DHT22? b. if I wanted to use a Chronodot, I would need a pull-up resistor as well, correct? If so, how do I determine which resistor value to use?
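To show where my confusion comes from, here is my rough attempt at the math, assuming the data line behaves like an open-drain bus with roughly 50 pF of stray capacitance (a number I'm just guessing at, please correct me if it's way off):

```python
import math

# Back-of-envelope for pull-up resistor sizing on an open-drain data line.
# Assumptions (my guesses):
#   - total bus capacitance C ~ 50 pF (wiring + pin capacitance)
#   - the line reads "high" once it crosses ~70% of Vcc; an RC charge
#     takes t = -R*C*ln(1 - 0.7), i.e. about 1.2 * R * C, to get there
#   - supply is 3.3 V (Arduino Pro Mini 3.3 V)

C = 50e-12   # farads, assumed stray capacitance
VCC = 3.3    # volts

for R in (1e3, 3.3e3, 10e3):
    t_rise = -R * C * math.log(1 - 0.7)  # seconds to reach 70% of Vcc
    i_sink = VCC / R                     # current the sensor sinks when pulling the line low
    print(f"R = {R/1e3:4.1f} kOhm: rise to 70% ~ {t_rise*1e9:6.1f} ns, "
          f"low-state current ~ {i_sink*1e3:.2f} mA")
```

If that reasoning is anywhere near right, even 10 kΩ charges the line in well under a microsecond, which seems fast compared to the DHT22's bit timings (tens of microseconds, if I'm reading the datasheet correctly), and a smaller resistor just wastes more current when the line is held low. Is that why the exact value isn't critical?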
Thanks so much in advance... and I'm sorry for my "beginner" questions :)