Sensor voltage across a resistor


I want to read an analog voltage from a sensor that is designed to drive a 0-50 µA DC meter. To get an input of 0-5 V DC at the Arduino, would I just use Ohm's law (R = 5 V / 50 µA = 100 kΩ) and put a 100 kΩ resistor from the input pin to ground? Seems too simple!

Indeed, too simple. What is the output impedance and voltage compliance of the sensor?

For instance, is it a genuinely constant-current output with rail-to-rail compliance? Likely not: it is probably a voltage source behind a series resistor already. In other words, it may be designed to drive a particular meter with a particular resistance (usually in the hundreds to low thousands of ohms for that sort of analog movement).

Get out the multimeter and measure the output voltage with nothing else connected.

Then place the multimeter (on a current range) in series with a 1 kΩ resistor, measure the output current, and compare.

Thank you!

If you want to read a voltage on an analog pin: first, use your voltmeter and measure the device on its own (without the Arduino). Check its high and low range. Then you can decide how to feed that into the analog pin. If the voltage is too high, you will need a voltage divider; if it stays below 5 V, maybe not.

You may also need a load resistor across the device's output.