Hello, I am implementing a straightforward IR emitter/receiver pair for short-range rangefinding and have come across some strange behaviour.
I power the receiver from the board's 5V line and ground it through a resistor to the board's GND, forming a voltage divider. I read the voltage at the junction between the receiver and the resistor to get the receiver's response. When measuring with a multimeter, I get wonderfully variable readings covering most of the 5V range.
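To put numbers on that, here is a small sketch of the (unloaded) divider math. The 530k fixed resistor and the ~50k receiver minimum are the values given later in this post; the 10M "dark" value is just an illustrative guess for the receiver's high end:

```python
# Divider topology as described: receiver from 5 V down to the tap,
# fixed resistor from the tap to GND, voltage read at the tap.
VCC = 5.0
R_FIXED = 530e3  # fixed resistor to GND (value from the post)

def tap_voltage(r_receiver):
    """Unloaded voltage at the divider tap for a given receiver resistance."""
    return VCC * R_FIXED / (r_receiver + R_FIXED)

for r in (50e3, 530e3, 10e6):  # 10M is a guessed "dark" resistance
    print(f"receiver = {r/1e3:>7.0f}k -> tap = {tap_voltage(r):.2f} V")
```

This reproduces the nice wide swing seen on the multimeter: about 4.57 V at the receiver's 50k low end down toward 0 V as its resistance climbs.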
So far so good. However, when I connect the board's analog pin, things get weird. I get beautiful readings as long as the multimeter is also connected. The moment I disconnect the probes, the ADC reading climbs to around 1000 and hovers there.
What is happening here? One theory I have is that the overall resistance of the divider is too high. I have measured that the receiver's resistance drops from astronomical values to around 50k at its low end. I chose a 530k resistor so that I get a nice range of values (the receiver completely dominates when its resistance is high, the resistor mostly dominates when it is low). However, it might be that the trickle of current going through the divider can't hold the voltage once the nasty ADC line joins the party.
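A rough way to check that theory: compare the divider's Thevenin (source) impedance against what the ADC wants to see. The 10k figure below is the source-impedance recommendation from the ATmega328P datasheet, so this assumes an AVR-based Arduino, which the post does not actually state:

```python
# Rough sanity check of the "divider impedance is too high" theory.
R_FIXED = 530e3         # fixed resistor from the post
ADC_MAX_SOURCE = 10e3   # ATmega328P datasheet recommendation (assumed board)

def thevenin(r_receiver):
    """Source impedance seen by the ADC pin: receiver || fixed resistor."""
    return (r_receiver * R_FIXED) / (r_receiver + R_FIXED)

for r in (50e3, 10e6):  # receiver low end from the post; high end guessed
    z = thevenin(r)
    print(f"receiver = {r/1e3:.0f}k -> source impedance = {z/1e3:.0f}k "
          f"({z/ADC_MAX_SOURCE:.0f}x the recommended maximum)")
```

Even at the receiver's 50k low end the pin sees roughly 46k, several times the recommended maximum, which is at least consistent with the trickle-of-current theory above.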
I suppose I'll have to compromise and lower the overall resistance of my divider, even if it means losing my excellent value range. But why do things work so beautifully while the multimeter is connected? It is obviously doing something wonderful, and I'd like to replicate that without having to tape the multimeter to the board. Any ideas what that something might be?