Okay… so I’m sitting down trying to sort out something that seemed simple, and of course it’s turning out not to be. I’m pretty sure I’m missing some critical piece of transistor theory, so here goes.
I’m powering LEDs.
I’m using an LM317 as a current source, feeding three power LEDs in series (white, rated 3.2–3.4 V at 300 mA). The current sources are fed from a switching power supply set to 13 V, which leaves about 2 V for the regulator and another volt for the transistor. I’m using a 4 Ω set resistor, which puts the output at about 313 mA (1.25 V / 4 Ω). The regulator output feeds the LEDs in series, then the collector of an NPN 2N5550; the transistor switches the string to ground for PWM. The base resistor is 235 Ω, which should provide about 21 mA (more like 18 mA once you allow ~0.7 V for V_BE), which should be MORE than enough to push the transistor into saturation.
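For reference, here’s the arithmetic behind those numbers as a minimal Python sketch. The 1.25 V LM317 reference and the ~0.7 V base-emitter drop are assumed textbook values, so adjust them if your parts measure differently:

    # Back-of-the-envelope numbers for the driver described above.
    # Assumed constants: 1.25 V LM317 reference, ~0.7 V base-emitter drop.
    V_REF = 1.25            # LM317 reference voltage across the set resistor (V)
    R_SET = 4.0             # current-setting resistor (ohms)
    i_set = V_REF / R_SET
    print(f"LM317 set current: {i_set * 1000:.1f} mA")    # 312.5 mA, call it 313 mA

    V_DRIVE = 5.0           # logic-high level driving the base resistor (V)
    V_BE = 0.7              # assumed base-emitter drop (V)
    R_BASE = 235.0          # base resistor (ohms)
    i_base = (V_DRIVE - V_BE) / R_BASE
    print(f"Base drive current: {i_base * 1000:.1f} mA")  # ~18 mA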
So, when the LEDs are connected with an ammeter in series with them, things get confusing.
The first reading, 250 mA, is understandable given resistor tolerances and such; that’s the current with the string returned straight to ground, bypassing the transistor. I’d be happy feeding the LEDs 250 mA, since the difference in brightness wouldn’t be noticeable in most cases. However, when I connect the base resistor to 5 V (logic high), about 18 mA flows into the base (fine), but the measured LED current is only 150 mA. I’m “losing” 100 mA to the transistor, and for the life of me it’s not making any sense.
The transistor’s hFE is supposed to be a minimum of 20. What am I missing?
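For completeness, here’s the back-of-the-envelope saturation check that has me convinced the base drive should be adequate (a minimal Python sketch using the datasheet minimum hFE and the figures from above):

    # Saturation check: with the datasheet minimum hFE of 20, how much base
    # current should the 2N5550 need to pass the full LED current?
    HFE_MIN = 20            # 2N5550 minimum hFE (datasheet)
    i_led_target = 0.313    # A, what the LM317 is set to deliver
    i_base_meas = 0.018     # A, measured into the base at logic high

    i_base_needed = i_led_target / HFE_MIN
    print(f"Base current needed at hFE = 20: {i_base_needed * 1000:.1f} mA")  # well under 18 mA
    print(f"Base current actually supplied:  {i_base_meas * 1000:.0f} mA")    # 18 mA
    # On paper the supplied 18 mA exceeds what hFE = 20 requires, so the
    # transistor should be hard on, yet the LED current falls from 250 mA
    # to 150 mA when switching through it.

On paper the drive looks sufficient, which is exactly why the missing 100 mA has me stumped.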