I want to calculate the gate resistor for my MOSFET circuit. I've looked at a lot of posts, and some people say to calculate it like this: R = V/I, using Ohm's law. So basically, if you are turning on the MOSFET with 10 V and want 100 mA: R = 10/0.1 = 100 Ω. My only problem is: wouldn't the resistor then drop the full 10 V?
Do you mean a resistor between the pin output and the gate? That makes no sense as a way to set current: the gate resistor has no influence on the drain-source current. The gate resistor should just be something like a few hundred ohms, to limit the gate's capacitive inrush current.
Exactly. But it forms an RC circuit (the MOSFET gate is basically a capacitor plate), so as the gate charges, the voltage across the resistor and the current through it fall, and the voltage on the gate rises.
@johnfg: specify a bit more precisely what you want to do with your MOSFET.
I just wanted to calculate the resistor between the gate and my Arduino or 555 timer pin. I don't want it to draw too much current, but I wasn't sure whether calculating the resistance with R = V/I would work. I thought that if I did something like R = 10/0.1 = 100 Ω, where my voltage is 10 V, my resistor would end up dropping the full 10 V. I get it now, though. My only question is whether I can use Ohm's law (R = V/I) or have to use another formula.
Ohm's law applies, only the gate is a dead end, so the static current is 0 ;) Because, as said, the gate is just a capacitor. Once charged, no current flows.
Things get different when we have dynamic behavior. But unless the gate capacitance is large and the frequency is high, there is no problem, and in most Arduino applications that's not the case, so just connect it.
The gate voltage needed to pass the drain current you want is another story ;) A logic-level MOSFET can pass its full specified current at a logic-level (= Arduino) gate voltage, but non-logic-level MOSFETs have limited current capacity when driven by the Arduino.
What about when using something like a MOSFET driver? In one of my circuits I want to drive 13.3 V into my MOSFET, and I don't want my gate resistor to drop more than 3.3 V. If it does, my MOSFET might not turn on fully. Of course 9 V would be fine, but imagine I wanted 10 V. What would I do?
MOSFET gate capacitance is typically pretty large, in the 2 nF to 30 nF sort of range - too high to drive nicely from a logic signal.
A 555 output is pretty beefy, though (it can source or sink around 200 mA), and should be a good driver for a MOSFET. Be sure to decouple the 555 well so it's not let down by the power source.
johnfg: What about when using something like a MOSFET driver? In one of my circuits I want to drive 13.3 V into my MOSFET, and I don't want my gate resistor to drop more than 3.3 V. If it does, my MOSFET might not turn on fully. Of course 9 V would be fine, but imagine I wanted 10 V. What would I do?
I don't understand what you mean - the gate resistor just limits the transient current and the switching speed; it does not affect the voltage on the gate. The driver determines that.
[ You are forgetting that the gate resistor drops 0 V in steady state ]
@johnfg: The formula depends on the switching frequency. Again, what do you want to do with your MOSFET? Tell us, please, otherwise you will not get an answer.
I want to do many things, from driving a motor at maybe 25 kHz to other, higher-frequency things. But if the resistor doesn't affect the voltage at the gate, and I can limit the current at the gate with R = V/I, where V is the voltage with which I want to drive my gate, then I'm fine :D
At 25 kHz PWM you'll use a MOSFET driver and no gate resistor. The resistor affects the time it takes to switch, and switching losses are roughly proportional to the switching time multiplied by twice the PWM frequency (one loss event per edge, two edges per cycle).
With 25 kHz you have to: 1. know the total input capacitance of the power MOSFET (usually 60 pF to 15 nF); 2. know the shortest on/off pulses (in case of PWM), let's say 2 µs; 3. pick a resistor value that charges/discharges the gate capacitance in a time shorter than, say, 10% of the 2 µs = 200 ns.
The formula: https://en.wikipedia.org/wiki/RC_time_constant
Based on the value of the resistor (and the driving voltage), you have to select a driver, since it has to source/sink a peak current of I = Vdriver/R.
Or just select the right driver with the right output resistance / output current...
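The three steps above can be sketched as a quick calculation. This is only an illustration of the procedure: the function names are mine, the "3 time constants ≈ fully charged" assumption (about 95%) is a rough rule, and the 10 nF / 2 µs / 13.3 V figures are just the example numbers from this thread, not values from any datasheet.

```python
def max_gate_resistor(c_gate, t_pulse, fraction=0.1, settle_taus=3.0):
    """Largest gate resistor that still charges the gate within a
    fraction of the shortest PWM pulse, assuming roughly settle_taus
    RC time constants to count as 'charged' (3 tau is about 95%)."""
    t_target = fraction * t_pulse        # e.g. 10% of 2 us = 200 ns
    return t_target / (settle_taus * c_gate)

def peak_drive_current(v_drive, r_gate):
    """Peak current the driver must source/sink: I = Vdriver / R."""
    return v_drive / r_gate

# Example figures from the thread: 10 nF total gate capacitance,
# 2 us shortest pulse, 13.3 V drive voltage
r = max_gate_resistor(10e-9, 2e-6)       # about 6.7 ohms
i = peak_drive_current(13.3, r)          # about 2 A peak
```

With numbers like these, the required resistor comes out at only a few ohms and the peak current approaches 2 A, which is exactly why a dedicated gate driver with suitable output current is the practical choice at 25 kHz.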
OK, let's forget about MOSFET drivers and time/frequency. I just want to limit the current into the MOSFET gate so my Arduino (or whatever my circuit is) doesn't output too much current into it. I just want to know if I can use Ohm's law (R = V/I) to limit the current, where V is the voltage I want to drive the gate with (5 V for my UNO). My original question was whether the resistor would reduce my voltage to something lower than 5 V, which from what I've seen it doesn't, and whether using Ohm's law to limit the current at the gate would actually work:
If I can use R = 5/0.02 = 250 Ω to limit the current to 20 mA without limiting the voltage, I'm fine! Thank you, guys!
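As a sanity check on that arithmetic, here is a minimal sketch (the function name is mine, not from any library). The point it illustrates: the uncharged gate momentarily looks like a short, so R = V/I sets only the peak pin current, and the resistor drops ~0 V once the gate is charged.

```python
def gate_resistor_for_peak_current(v_pin, i_max):
    """Ohm's law sizing: at the switching instant the uncharged gate
    capacitance looks like a short, so the peak pin current is V / R.
    The resistor limits only this peak; in steady state the gate
    draws ~0 A and the resistor drops ~0 V."""
    return v_pin / i_max

# Arduino UNO pin at 5 V, keeping the peak pin current under 20 mA
r = gate_resistor_for_peak_current(5.0, 0.020)   # 250 ohms
```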
The MOSFET is NOT a current-driven device. It is a voltage-driven device. The input resistance of the MOSFET gate is hundreds or thousands of megaohms.
In static operation (no switching), NO current flows into the gate.
So no resistor is needed.
If the output voltage of the 555 is 10 V, there will be 10 V at the MOSFET's gate for any reasonable resistor wired in between (i.e. 1 kΩ, 10 kΩ, 100 kΩ, 1 MΩ, or 10 MΩ) in static operation.
Mind that the threshold voltage of power MOSFETs can be anywhere from 0.5 to 10 V. There is no input current in static operation.
In case you are switching it (i.e. the rising/falling edge of the gate voltage takes 10 ns), the peak gate current could reach tens of amps or more. That current flows not through the gate electrode into the channel, but into the "gate capacitance" - imagine a capacitor wired between G and S (and D).
You have to read about the RC time constant at the link above.
The gate acts a lot like a capacitor. A resistor between the Arduino pin or 555 timer output and the gate will charge it just as in an RC circuit: the gate will reach about 63% of the Arduino's or 555's Vcc after one time constant, called tau, which is equal to the product of the resistance and the gate capacitance.
So within about 5 time constants, the gate voltage will effectively reach Vcc. As you can see, the current drops off as the gate charges, so less voltage is dropped across the resistor as the gate charges.
Note: it is a simplification to call the gate a capacitor, but that isn't too important in this case.
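That charging curve is the standard RC step response, V(t) = Vcc·(1 − e^(−t/RC)). A small sketch with made-up but plausible values (the 250 Ω / 10 nF pairing is mine, echoing the numbers discussed above):

```python
import math

def gate_voltage(vcc, t, r, c_gate):
    """RC step response of the gate: V(t) = Vcc * (1 - e^(-t / (R*C)))."""
    tau = r * c_gate
    return vcc * (1.0 - math.exp(-t / tau))

tau = 250 * 10e-9                             # 2.5 us time constant
v1 = gate_voltage(5.0, tau, 250, 10e-9)       # ~3.16 V: ~63% after 1 tau
v5 = gate_voltage(5.0, 5 * tau, 250, 10e-9)   # ~4.97 V: ~99% after 5 tau
```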
[ Cringes at the obviously inaccurate shape of the exponential curve in those diagrams ]
Here’s a better one: http://expeyes.in/experiments/rc-transient.html
And when the gate is at 10 V and the 555 output goes to 0 V, the gate capacitance discharges through the resistor, with the same RC time constant, into ground (because the 555's output is at GND). The current through the resistor flows in the opposite direction, however.