I have an infrared LED and an infrared phototransistor and want to use them for a line follower, so I wanted to know how I should hook them up. The infrared LED's datasheet says its forward voltage is 1.2 V and forward current is 100 mA, so what kind of resistor would I need for it, and wouldn't that be cutting pretty close to the Arduino's mA limit? Do I need a resistor for the infrared phototransistor too?
How do you hook up the phototransistor? It has an emitter and a collector pin. I figure one would have to go to an analog pin, but I don't know about the other. Also, does the IR LED go to a digital pin on one leg and to ground on the other?
Time to give this another outing.
Leave the switch out and hook an analog or digital pin up to the test point. The resistor values are deliberately vague; it's suck-it-and-see time. The test point is high when the LED isn't on.....
I suggest a 220 ohm resistor on the LED and a 10k resistor on the phototransistor to start off with. Also try 330 and 470 ohms on the LED and/or 1k and 100k ohms on the transistor.
In both cases, go for the highest value that still functions properly.
A lower value resistor on the LED increases the brightness of the LED, thus increasing the sensitivity; too low will burn out the LED and/or the Arduino pin driving it. A lower resistor value on the phototransistor decreases the voltage drop for a given light level reaching it, thus decreasing the sensitivity. On the other hand, a lower value will increase its immunity to noise. Suck it and see.....
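If it helps to see it as a sketch, here's a minimal test of that circuit. I'm assuming the IR LED (through its series resistor) hangs off pin 8 and the test point is wired to A0; the pin numbers and the idea of subtracting an LED-off reading are just my way of doing it, not the only way:

```cpp
const int LED_PIN = 8;     // IR LED via its series resistor (e.g. 220 ohm)
const int SENSE_PIN = A0;  // phototransistor test point

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  digitalWrite(LED_PIN, HIGH);      // light up the surface
  delay(1);                         // let the reading settle
  int lit = analogRead(SENSE_PIN);  // drops when IR is reflected back

  digitalWrite(LED_PIN, LOW);
  delay(1);
  int dark = analogRead(SENSE_PIN); // high when the LED isn't on

  Serial.println(dark - lit);       // big difference = reflective surface;
  delay(100);                       // the subtraction cancels ambient IR
}
```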
How do you figure out the resistors for the IR LED and phototransistor?
For the infrared LED, you get the maximum current from the datasheet and choose an appropriate resistor based on that. For the phototransistor, you suck it and see: try a value, hook it up to a digital multimeter, and check whether you get appropriate voltages under the conditions you want it to work under. A nice swing from around 1 V to 4 V works nicely with a digital pin. There are no hard values because it depends on the optical arrangement as much as anything.
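If you don't have a multimeter to hand, the Arduino can do the measuring itself. A minimal sketch, assuming the test point is on A0 and a 5 V board (both assumptions):

```cpp
const int SENSE_PIN = A0;  // phototransistor test point

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(SENSE_PIN);     // 0..1023
  float volts = raw * (5.0 / 1023.0);  // scale to volts on a 5 V board
  Serial.println(volts);               // watch for a ~1 V to ~4 V swing
  delay(200);                          // while you swap resistor values
}
```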
@demonic_crow:
Resistor values are easily determined. Let's say you want to reach the 1.2 V on your IR LED. That means, as the resistor and LED form a series connection, that you will have to "burn" 3.8 V on your resistor (5 V supply - 1.2 V forward voltage from the LED = 3.8 V). Now if the LED should have 100 mA, you just divide voltage by current, so it's 3.8 V / 0.1 A = 38 ohms. So the optimal resistor for an LED with 1.2 V forward voltage and 100 mA of recommended current at a 5 V supply voltage would be 38 ohms.
In fact, as an Arduino pin should only source about 20 mA (40 mA is the absolute maximum rating), you should divide by 0.02 A (= 20 mA), which gives 190 ohms.
By the way, you shouldn't forget to watch your resistor wattage. At 100 mA there's 3.8 V * 0.1 A = 0.38 W of heat produced in your resistor - too much for a standard quarter-watt resistor. At 20 mA this is no problem (3.8 V * 0.02 A = 0.076 W).
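If you want to play with other combinations, the same arithmetic drops straight into a sketch. The numbers below are just the example values from this post:

```cpp
// Ohm's law on the resistor's share of the supply voltage
float seriesResistor(float vSupply, float vForward, float amps) {
  return (vSupply - vForward) / amps;
}

// power dissipated in that resistor
float resistorWatts(float vSupply, float vForward, float amps) {
  return (vSupply - vForward) * amps;
}

void setup() {
  Serial.begin(9600);
  // 5 V supply, 1.2 V forward drop, 20 mA:
  Serial.println(seriesResistor(5.0, 1.2, 0.02)); // prints 190.00 (ohms)
  Serial.println(resistorWatts(5.0, 1.2, 0.02));  // prints 0.08 (i.e. 0.076 W)
}

void loop() {}
```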
Thanks, that makes a lot more sense now. I already knew about the 40 mA per pin, so it doesn't matter whether I figure it out for 20 mA or 100 mA then? What would be the difference, and what should I do? I was going to just do 20 mA since most LEDs are 20 mA. Also, the phototransistor resistor is a bit confusing. How do you know what the right one for it is?
As I tried to say in my last post, you determine it experimentally, with the thing optically arranged as it's going to be in the finished item. Don't go below 1k and it won't damage anything; you can use a digital multimeter to measure the output voltage while you're testing. Higher values give a greater voltage swing. 100k - 1M is a good place to start.
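And once you've found a value that gives a wide swing, a plain digital read is all the line follower itself needs. A rough sketch, assuming the test point is moved to pin 2 and the IR LED is lit continuously (say, wired to 5 V through its resistor) - all of that is my assumption, matching the "test point is high when the LED isn't on" circuit above:

```cpp
const int SENSE_PIN = 2;  // phototransistor test point

void setup() {
  pinMode(SENSE_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  if (digitalRead(SENSE_PIN) == LOW) {
    Serial.println("reflective surface");            // IR is bouncing back
  } else {
    Serial.println("dark line / nothing in range");  // test point pulled high
  }
  delay(200);
}
```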
Forget the Arduino and theory; wire it up on a breadboard and try it.