I'm working through the Arduino Starter Kit lessons, and on my current lab (#4- Color Mixing Lamp), I'm trying to understand the logic behind the selection of a particular resistor.
Basically, you have a phototransistor (data sheet is here) connected to a 5V source. On the emitter side of the phototransistor, in series, is a line to the Analog In and then a 10 KOhm resistor, then finally a line to ground.
Can you explain the rationale for selecting this 10KOhm resistor? Can you frame it in terms of Ohm's law? Also, please explain the relevant details from the datasheet and how you were able to infer all this (because I don't follow it at all).
The schematic shows two devices in SERIES, from +5V to ground. That means the point where they join will sit at some voltage between 5V and 0V, and that voltage is set by the ratio of their RESISTANCES.
Example: there is just enough light on the phototransistor that its effective resistance is 10K, the same as the fixed resistor. The VOLTAGE DIVIDER created by the two components will have equal voltage across each element, or 2.5V each. Different amounts of light will change that voltage, so you can 'measure' it and act upon it.
If you know the two resistances and want to calculate the voltage across one of them, use the formula:

Voltage Divider: Vx = (Rx/RT) x VT

Read that as "Voltage across resistor x equals (Rx/Rtotal) times VoltageTotal."
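As a quick sanity check, the divider formula can be worked through in a short Python sketch (the function name and the 2K "brighter light" value are just illustrative, not from the lesson):

```python
def divider_voltage(r_x, r_other, v_total):
    """Voltage across r_x in a series pair: Vx = (Rx / Rtotal) * Vtotal."""
    return (r_x / (r_x + r_other)) * v_total

# The example above: phototransistor acting like 10K, fixed 10K resistor, 5V supply
print(divider_voltage(10_000, 10_000, 5.0))  # -> 2.5

# More light -> lower effective phototransistor resistance
# -> larger share of the 5V appears across the fixed 10K resistor
print(divider_voltage(10_000, 2_000, 5.0))   # -> ~4.17
```

Note that the measured voltage is taken across the fixed resistor, so brighter light (lower transistor resistance) pushes the Analog In reading up.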
Can you explain the rationale for selecting this 10KOhm resistor?
The real truth is, somebody probably chose it experimentally (trial-and-error). The required resistance depends on the light level.
If you look at the datasheet where it says Collector photo current, it shows a typical (mis-abbreviated) current of 60uA at a light level of 10 lux. (They test it with a 1K resistor.)
Can you frame it in terms of Ohm's law?
Transistors (and phototransistors) are "current operated" devices. 10K x 60uA is 0.6V across the resistor. With 1K and the same current you'd have 0.06V.
With 100K, the calculation says 6V, which is impossible with a 5V supply, so the transistor would be in saturation: you'd be getting less than 60uA, and a little less than 5V across the resistor.
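The three cases above (1K, 10K, 100K) can be illustrated with a small Python sketch. It treats the phototransistor as a 60 uA current source until it runs out of headroom; the 0.2 V saturation drop is an assumed placeholder, not a figure from this datasheet:

```python
def resistor_voltage(r_ohms, i_photo=60e-6, v_supply=5.0, v_ce_sat=0.2):
    """Voltage across the emitter resistor, modeling the phototransistor as a
    fixed 60 uA current source that clamps once it saturates."""
    v_ideal = i_photo * r_ohms       # Ohm's law: V = I * R
    v_max = v_supply - v_ce_sat      # the transistor can't push the node higher
    return min(v_ideal, v_max)

for r in (1_000, 10_000, 100_000):
    print(f"{r:>7} ohm -> {resistor_voltage(r):.2f} V")
# 1K gives 0.06 V, 10K gives 0.60 V,
# 100K clamps near 4.8 V (transistor saturated, current drops below 60uA)
```

This is why 10K is a sensible middle choice: it puts the typical reading comfortably inside the ADC's 0-5V range without saturating the transistor.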
10K x 60uA is 0.6V across the resistor. With 1K and the same current you'd have 0.06V.
Great, this explains how the resistor selection will affect the voltage across it.
However, just to be clear, we're also assuming the phototransistor's output current is constant at 60uA (per the datasheet). Wouldn't the current depend on the resistance? I'm used to analyzing series circuits where you have a fixed resistance and voltage source and calculate the current from those. Maybe this is related to the phototransistor being "current operated"?