I'm doing some projects using IR MOSFETs (like the 530, 44N, 540, ...). I checked, and their leakage current is within the range given in the datasheet.
But in the off state I still get current through the drain-source path. For an LED on 5V, I put a 1.2k or 1k resistor in parallel with it and that works. But with a small 12V 16W white LED bulb, according to my calculations I would need something like a 50-ohm resistor, which would not live long, lol.
So what's the solution?
For info: I already put a 10k resistor between gate and source, and nothing changes.
"Leakage" is current (not voltage) that "leaks through" when the device is "off".
If you have a load, the full 12V should appear across the MOSFET when it's off (except for a tiny voltage across the load due to that leakage).
When it's on, the voltage should appear across the load and there should be a very small voltage across the MOSFET. That small voltage is due to the on-resistance (Rds(on)), which ideally would be zero.
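To put rough numbers on that, here is a quick sketch. It treats the bulb as a simple resistive load just for the arithmetic (a real LED bulb is nonlinear, which is why even a tiny leakage can make it glow faintly), and the leakage and Rds(on) figures are ballpark datasheet-style assumptions, not measurements:

```python
# Rough sketch of where the voltage goes in each state (all values are
# assumed ballpark figures, not measurements - check your datasheet).

V_SUPPLY = 12.0        # V, supply for the LED bulb
I_LEAK = 250e-6        # A, worst-case off-state leakage (Idss, assumed)
R_DS_ON = 0.05         # ohm, assumed on-resistance at full enhancement
I_LOAD = 16.0 / 12.0   # A, nominal current of a 16 W / 12 V bulb

# Off state: only the leakage flows through the load, so the load sees a
# tiny voltage and (almost) the full supply appears across the MOSFET.
r_load_equiv = V_SUPPLY / I_LOAD      # ~9 ohm equivalent at full power
v_load_off = I_LEAK * r_load_equiv    # ~2.3 mV across the load
v_mosfet_off = V_SUPPLY - v_load_off  # ~12 V across the MOSFET

# On state: the full load current flows through Rds(on), so the MOSFET
# drops only a small voltage and the rest appears across the load.
v_mosfet_on = I_LOAD * R_DS_ON        # ~67 mV across the MOSFET
v_load_on = V_SUPPLY - v_mosfet_on    # ~11.9 V across the load

print(f"Off: {v_mosfet_off:.2f} V on MOSFET, {v_load_off*1000:.1f} mV on load")
print(f"On:  {v_mosfet_on*1000:.0f} mV on MOSFET, {v_load_on:.2f} V on load")
```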
If you are getting 2.5V across the MOSFET when it's on, it's not turning fully on. You probably have the wrong MOSFET. You need a "logic-level" MOSFET that will turn fully on with 5V at the gate (Vgs = 5V).
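As a sanity check on why that matters, a partially-on MOSFET dissipates the "missing" volts times the load current as heat. The 2.5V drop is the figure above, the current is just the nominal rating of the 16W / 12V bulb from the question, and the fully-on Rds(on) is an assumed ~50 mOhm:

```python
# Power burned in the MOSFET when partially on vs. fully enhanced
# (2.5 V drop and 16 W / 12 V bulb are from the thread; Rds(on) is assumed).

I_LOAD = 16.0 / 12.0            # A, nominal bulb current (~1.33 A)
V_DROP_PARTIAL = 2.5            # V across the MOSFET when not fully on
V_DROP_FULL = I_LOAD * 0.05     # V across ~50 mOhm when fully on

p_partial = V_DROP_PARTIAL * I_LOAD   # ~3.3 W -> hot MOSFET, dim LED
p_full = V_DROP_FULL * I_LOAD         # ~0.09 W -> barely warm

print(f"Partially on: {p_partial:.1f} W dissipated in the MOSFET")
print(f"Fully on:     {p_full:.2f} W dissipated in the MOSFET")
```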
Normally, "high power" LEDs use a special constant-current driver. You can use a resistor for current limiting BUT you need some extra voltage... Maybe 18V (or more) for a 12V LED, allowing 6V across the resistor. 12V for a 12V LED leaves no voltage for the resistor and you will "calculate" zero Ohms. Without a resistor or current control/limiting you can't control the current and it won't work properly... You can get unpredictable results... The LED might get excess current, frying the LED, MOSFET, or power supply, or you might not get full-current and full brightness.
You also need to calculate the wattage for the resistor. Typically, the power dissipated by the resistor is in the same ballpark as the LED's, so you'll probably need a 10W or 20W resistor.
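For example, with the numbers from this thread (a nominal 16W / 12V LED run from an 18V supply, so 6V of headroom for the resistor), the math works out roughly like this. It assumes a "raw" LED with no internal driver and uses the bulb's nominal rating as the target current, so treat it as a sketch rather than a design:

```python
# Sizing a series current-limiting resistor for a 12 V LED on an 18 V supply
# (numbers from the thread; a constant-current driver is still the better option).

V_SUPPLY = 18.0          # V, supply with some headroom
V_LED = 12.0             # V, nominal LED voltage
I_LED = 16.0 / 12.0      # A, nominal current of the 16 W bulb (~1.33 A)

v_resistor = V_SUPPLY - V_LED     # 6 V left over for the resistor
r_series = v_resistor / I_LED     # ~4.5 ohm
p_resistor = v_resistor * I_LED   # ~8 W -> use a 10-20 W resistor

print(f"R = {r_series:.1f} ohm, dissipating about {p_resistor:.1f} W")
```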
If you buy an LED "light bulb" from a home improvement store, yes, it already has its own current limiting (a built-in driver). Same for an LED strip. If you buy a "raw" LED, no.