Should I Be Using a Constant Current or Constant Voltage Device for My Project?

Hi all. I'm trying to determine whether I should be using a Constant Current or Constant Voltage Device for my project. Or maybe I need both for different sub-components of the project?

Let me explain my project in more detail below.

I'm testing 12 (soon to be 16) IR transmissive slot sensors to see how consistent the data is across a batch of them. They are the EE-SX1140 model from Omron. (Link here: http://www.omron.com/ecb/products/pdf/en-ee_sx1140.pdf) On one side, there is an IR LED ("emitter"). On the other side is a phototransistor ("detector").

The Omron sensors are usually used in vending and automotive applications as binary "on/off" switches. However, I'm taking analog readings off of them using a 1k Ohm resistor. I'm trying to use them to distinguish between materials of different transparencies. When a semi-transparent material is in place, the output voltage is proportional to how much light is transmitted across the slot. In the picture way at the bottom of this post, you can see the yellow wires going into the analog inputs of my Arduino.
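For reference, here's roughly the kind of read loop I'm running (a simplified sketch only; the pin list and the assumption of a board with enough analog inputs, like a Mega, are just illustrative):

```cpp
// Minimal readout sketch. Assumes 12 detectors wired to A0..A11 on a board
// with enough analog inputs (e.g. an Arduino Mega); adjust to match wiring.
const uint8_t NUM_SENSORS = 12;
const uint8_t sensorPins[NUM_SENSORS] = {A0, A1, A2, A3, A4,  A5,
                                         A6, A7, A8, A9, A10, A11};

void setup() {
  Serial.begin(115200);
}

void loop() {
  for (uint8_t i = 0; i < NUM_SENSORS; i++) {
    int raw = analogRead(sensorPins[i]);          // 10-bit ADC, 0-1023
    Serial.print(raw);
    Serial.print(i < NUM_SENSORS - 1 ? '\t' : '\n');  // tab-separated row
  }
  delay(100);
}
```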

My main goal here is to get each of the 16 sensors to register approximately the same ADC values on my Arduino for the same transparent object, even with the manufacturing tolerances in the LEDs/phototransistors and resistors. (Note: I've gone ahead and gotten resistors with 0.1% tolerance. Also, I just finished some mounts that go on top of the sensors to position each object in exactly the same place on each sensor)

To get clean data, I realize that I can't just use an unregulated power supply.

I think I will have to use a Constant Current device to drive the LED emitters, since the amount of light they output depends on current. I think I will have to use a Constant Voltage device on the phototransistor detector side, since I want the same voltage (5.0V) coming into the Collector pin of each BJT.

Can you guys let me know if these assumptions are right? If not, can you walk me through any misconceptions I may have? Would be greatly appreciated.

I have tentatively selected these Constant Current and Constant Voltage devices. Any better suggestions would also be great.

Constant Voltage Device: Sparkfun PRT-13032

•6-12V input voltage via barrel jack or 2-pin header
•3.3V or 5V regulated output voltage
•800mA Operating Current

Constant Current Device: LDD-300H


Here's a pic of my project from a few days ago.

I'm using a 191 Ohm/0.1% resistor for the LEDs, and a 1k Ohm/0.1% resistor for the detectors.

Everything is on hex standoffs and perf boards for ease of removal.


At these low currents, a series resistor works as a constant-current source. As long as the power supply voltage is constant and the voltage drop across the LED is constant, you'll get constant current.
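For example, with a regulated 5 V supply, roughly a 1.2 V forward drop on the emitter (an assumed typical figure; check the datasheet), and your 191 Ohm resistor: I ≈ (5.0 V - 1.2 V) / 191 Ohm ≈ 20 mA. Even a 0.1 V shift in the forward drop only moves that by about 0.5 mA.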

Constant current power supplies are used with high-power LEDs because a resistor is not energy efficient. But, with regular low-power LEDs it's not an issue.

You will get manufacturing variations and you might need a pot (maybe in series with a fixed resistor) on one side or the other. Or, it might be better to calibrate-out the variations in software.
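As a concrete example of the software route, something like the sketch below could work (just an illustration; the per-sensor reference values, the pin, and the 0-1000 scale are placeholders you'd choose yourself). Record an "open slot" and a "blocked slot" reading for every sensor once, then rescale live readings onto a common range:

```cpp
// Per-sensor two-point calibration: map each raw reading onto a shared
// 0..1000 scale using two stored reference readings for that sensor.
const uint8_t NUM_SENSORS = 12;
int openRef[NUM_SENSORS];     // raw ADC with the slot empty (fill during calibration)
int blockedRef[NUM_SENSORS];  // raw ADC with the slot blocked (fill during calibration)

long normalize(uint8_t i, int raw) {
  long span = (long)openRef[i] - blockedRef[i];
  if (span == 0) return 0;                              // avoid divide-by-zero
  long scaled = ((long)raw - blockedRef[i]) * 1000L / span;
  return constrain(scaled, 0, 1000);
}

void setup() {
  Serial.begin(115200);
  // Fill openRef[] and blockedRef[] here, e.g. from a one-time calibration pass.
}

void loop() {
  // Example: print the normalized reading for sensor 0 on pin A0.
  Serial.println(normalize(0, analogRead(A0)));
  delay(100);
}
```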

0.1% resistors probably won't help since most of the variation is in the optical sensors.

One company I worked for used similar devices to sense the presence of a paper form, and sometimes too much IR light would leak through the paper so we had to select parts... which is a terrible thing to do! I don't remember why we didn't add a pot.

I'd consider how the parts are powered in the final design. The diode current will be affected by Vcc and the resistor, and their tolerances. Similarly for the transistor. I'd use one or two programmable voltage sources, so that the behaviour under minimum and maximum Vcc can be tested. Resistor tolerance can then be simulated by modifying the voltage as well.
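To get a feel for the numbers, a quick worst-case calculation like the one below helps (the ±5% supply tolerance and the 1.2 V forward drop are assumed figures; substitute your regulator's spec and the datasheet value):

```cpp
// Worst-case LED current range from supply and resistor tolerances.
#include <cstdio>

int main() {
    const double vf = 1.2;                          // assumed LED forward drop (V)
    const double vccMin = 5.0 * 0.95, vccMax = 5.0 * 1.05;   // assumed +/-5% supply
    const double rMin = 191.0 * 0.999, rMax = 191.0 * 1.001; // 0.1% resistor

    // Minimum current: lowest supply, highest resistance.
    double iMin = (vccMin - vf) / rMax;
    // Maximum current: highest supply, lowest resistance.
    double iMax = (vccMax - vf) / rMin;

    std::printf("LED current range: %.1f mA to %.1f mA\n",
                iMin * 1000.0, iMax * 1000.0);
    return 0;
}
```

With those assumed numbers, the supply tolerance dominates and the 0.1% resistor barely matters, which is the same point made above about where the variation really comes from.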