My understanding of MOSFETs is somewhat simplistic, but as I understand it, the gate is almost totally isolated from the source and drain, so I can think of it as voltage-controlled rather than current-controlled, i.e. when the gate voltage goes over the threshold it turns on, otherwise it's off. Just what I want.
So, my question is, is there any need for a gate resistor when driving it from an Arduino output? If no current flows into the gate, apart from charging up its small capacitance, then do I need one?
If there is a risk of damaging the microcontroller or MOSFET, then how real is that risk? I have used MOSFETs in this way lots of times without a problem; I just feel like I might be doing something 'naughty'.
Yes, the gate resistor is necessary. It is exactly that current that charges up the small capacitance that is the problem. The "small" capacitance is often on the order of nanofarads, while the digital output of the microcontroller is designed to drive other digital inputs, which are on the order of picofarads. Thus, a MOSFET gate "looks like" 100-1000 little digital inputs all in parallel, all hungry for a little bit of current.
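To put rough numbers on it (a back-of-the-envelope sketch, assuming a 5 V AVR-style output with an internal output impedance of roughly 25 Ω and a 10 nF gate, both assumed values for illustration), the unlimited peak charging current is approximately

$$ I_{peak} \approx \frac{V_{drive}}{R_{out}} = \frac{5\ \text{V}}{25\ \Omega} = 200\ \text{mA}, $$

which is far above the 40 mA absolute-maximum pin current of a typical AVR such as the ATmega328P. The burst only lasts on the order of $R_{out} C_{gate} \approx 25\ \Omega \times 10\ \text{nF} = 250\ \text{ns}$, which is why it usually "works", but the pin is still being driven well outside its rating every time it switches.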
The risk of damaging the microcontroller is real. The short burst of current can, over time, degrade the digital output and cause it to fail. The MOSFET will be no worse for wear, unless the degrading digital output eventually fails to turn the MOSFET on fully; the MOSFET then operates in its linear region, overheats, and fails too.
If you really want a sense of the problem, put a 1 ohm resistor in series with the gate of a relatively beefy MOSFET, such that its gate capacitance is at least 1 nF. Use an oscilloscope to measure the transient voltage across this 1 ohm resistor during turn-on and hence estimate the transient current. Chances are very good it will exceed the maximum current rating of an I/O pin. You'll get away with it... for a while.
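As a rough sizing guide (a sketch, assuming a 5 V drive and the 40 mA per-pin absolute maximum of a typical AVR), pick the gate resistor so the peak current stays within the pin rating:

$$ R_{gate} \ge \frac{V_{drive}}{I_{pin,max}} = \frac{5\ \text{V}}{40\ \text{mA}} = 125\ \Omega. $$

Hence the common choice of something in the 150 Ω to 1 kΩ range. Larger values slow the switching edge (time constant on the order of $R_{gate} C_{gate}$), which mostly matters if you are PWMing the MOSFET rather than switching it occasionally.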