I am using a 2N7000 MOSFET to drive a 12 V 1 W LED, and I am using the JLed library to fade the LED on, hold it on, fade it off, then delay.
The problem I am getting is that on the fade down, as it gets close to completely off, there is a little flicker.
This is all new to me, so it could be that I have to add a resistor somewhere, or that the MOSFET is not up to the job (eventually the LED will have to be held in the on position for 5 minutes).
Could anyone point me in the right direction?
#include <jled.h>

// Fade on over 1000 ms, hold full brightness for 5000 ms, fade off over
// 1000 ms, then wait 3000 ms and repeat forever.
auto led = JLed(6).Breathe(1000, 5000, 1000).DelayAfter(3000).Forever();

void setup() {
}

void loop() {
  led.Update();
}
Yes, you need a series resistor or some kind of current limiting/control. With "high power" LEDs an active constant-current (or controlled-current) source is normally used rather than a resistor, but a resistor can work.
LEDs are "current controlled". You supply the correct current and the voltage "falls into place", basically the opposite of how everything else works...
And in any case you need some "extra" voltage for the current source to work. With a resistor, 18 V would be OK: 6 V dropped across the resistor and 12 V across the LED.
With 6 V across the resistor (at full power/brightness) it will be dissipating about 0.5 W, so you need a "power resistor", and it's good practice to over-rate it rather than run it right at its limit.
The gate of a MOSFET is basically a small capacitor, so every time the pin switches it draws a brief surge of current. You need ~a 1k resistor between the output pin of the board and the gate to limit that surge. You also need ~a 10k from the gate to ground so the gate is never left floating and any charge built up on it can bleed off (keeping the MOSFET off during reset, before the pin is driven).
Which is completely normal.
Our eyes' response to brightness is not linear, it's roughly logarithmic.
PWM steps near 255 are invisible; PWM steps near 0 are not.
12-bit PWM with a logarithmic (gamma-corrected) scale could minimise this problem,
or simply don't dim any lower than a PWM value of 10 or 20.
Leo..