I am planning on using an IRLZ44N MOSFET for a project and need to check whether my assumptions about the need for a heatsink are correct. Somewhere down the rabbit hole I came across a formula which, I think, gives the maximum power dissipation without a heatsink as follows:
Pd = (Tj - Ta)/RthJA
Assuming Tj is the maximum junction temperature and Ta is the ambient temperature, then:
Pd = (175 - 25)/62 giving 2.4W
so if P=I^2R and the Rds(on) is 0.028 ohms, I=sqrt(2.4/0.028) = 9.26A
My thinking is that as long as my current draw is less than 9.26A (it will likely be no more than 5A), I do not require a heatsink! Is this assumption correct?
The plan was to trigger the gate with an Arduino pin made HIGH, with a resistor to limit inrush current.
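As a sanity check on that arithmetic, here is a minimal Python sketch of the same calculation; the 175 °C junction limit, 62 °C/W RthJA and 0.028 Ω Rds(on) are the figures quoted above, and the 25 °C ambient is an assumption.

```python
import math

# Figures quoted above, plus an assumed 25 C ambient
TJ_MAX_C = 175.0       # maximum junction temperature
TA_C = 25.0            # assumed ambient temperature
RTH_JA_C_PER_W = 62.0  # junction-to-ambient thermal resistance, no heatsink
RDS_ON_OHM = 0.028     # on-resistance, assumed constant here (it isn't -- see replies)

# Pd = (Tj - Ta) / RthJA : power the package can shed in free air
pd_max_w = (TJ_MAX_C - TA_C) / RTH_JA_C_PER_W

# P = I^2 * R  ->  I = sqrt(P / R)
i_max_a = math.sqrt(pd_max_w / RDS_ON_OHM)

print(f"Max dissipation without heatsink: {pd_max_w:.2f} W")    # ~2.42 W
print(f"Max continuous current at 0.028 ohm: {i_max_a:.2f} A")  # ~9.3 A
```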
It depends on many factors: the voltage you are switching, how hot the MOSFET gets (the hotter the MOSFET, the higher the Rds(on)), and your switching speed (the slower you switch, the longer you have a high voltage drop across the transistor and the more power it dissipates). Check the datasheet for how many degrees the MOSFET will heat up per watt of power dissipation, and decide on the maximum temperature you want to allow.
Your math is correct; however, these are not precise characteristics, and they only apply if the MOSFET is in "free air" (hanging on a board with absolutely nothing to block airflow). There is also the assumption that some heat leaves via the soldered leads.
Also:
if you look at the Rds(on) vs. temperature graph, the "typical" value at high temperature is about 2.5 times the advertised 0.028 Ω.
to get that Rds(on) you must put at least 10 volts on the gate.
I'm looking at the Infineon (formerly IR) spec sheet. It does state that the max resistance with 5 V on the gate at 25 °C is 0.025 Ω. However, most of the graphical data is at Vgs = 10 V.
Also the Arduino can only output something like 4.8 volts at a DIO pin.
But staying with the 0.025 Ω: the resistance-vs-temperature graph is indeed normalized, so if the graph shows ~2.2 at 175 °C, then the resistance is on the order of 0.025 × 2.2 = 0.055 Ω.
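As a rough illustration of what that derating does to the first post's estimate, here is a short sketch; the 2.2 normalization factor is the value read off the graph at 175 °C as described above, and the 2.4 W free-air budget is from the first post.

```python
import math

RDS_ON_25C_OHM = 0.025  # max Rds(on) at Vgs = 5 V, 25 C (datasheet table)
NORM_AT_175C = 2.2      # normalized Rds(on) factor read from the graph at 175 C
PD_MAX_W = 2.4          # free-air dissipation budget from the first post

rds_hot = RDS_ON_25C_OHM * NORM_AT_175C    # ~0.055 ohm
i_max_hot = math.sqrt(PD_MAX_W / rds_hot)  # current limit with the hot resistance
p_at_5a = 5.0**2 * rds_hot                 # conduction loss at the planned 5 A

print(f"Hot Rds(on): {rds_hot:.3f} ohm")                    # ~0.055 ohm
print(f"Current limit with hot Rds(on): {i_max_hot:.1f} A") # ~6.6 A instead of ~9.3 A
print(f"Conduction loss at 5 A, hot: {p_at_5a:.2f} W")      # ~1.4 W
```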
In my experience the I²R losses are low compared to the switching losses.
Switching losses are a function of how often you switch and how fast you can switch.
With a 20 kHz switcher, the gate is usually driven with an amp or so (from 12 V) to get reasonable switching losses.
If you are at 900 Hz or so, you can get by with less.
Now every time you switch, the dissipated power goes from 0, up to about 15 W, and back down to around 2.4 W.
The 15 W is only there for a short time, but it is a big number. The 15 comes from:
Assume you have 12 V and 5 A. Halfway through the switching time the voltage across the MOSFET is 6 V and the current is 2.5 A, giving 15 W (assuming a resistive load; it is higher for an inductive or capacitive load).
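A back-of-the-envelope sketch of that transition loss, assuming 12 V, 5 A, a resistive load, linear switching edges, the 900 Hz mentioned above, and a purely illustrative 1 µs transition time (not a datasheet figure):

```python
V_SUPPLY = 12.0  # volts, from the example above
I_ON = 5.0       # amps, the planned load current
T_SW = 1e-6      # seconds per edge -- an assumed, illustrative transition time
F_SW = 900.0     # Hz, switching frequency mentioned above

# Resistive load, linear edges: power peaks mid-transition at V*I/4
p_peak = V_SUPPLY * I_ON / 4                 # ~15 W, as in the post above

# Energy per edge = V * I * t_sw / 6 for a resistive load with linear ramps
e_per_edge = V_SUPPLY * I_ON * T_SW / 6
p_switching = e_per_edge * 2 * F_SW          # two edges (on + off) per cycle

print(f"Peak power mid-transition: {p_peak:.1f} W")                       # ~15 W
print(f"Average switching loss at {F_SW:.0f} Hz: {p_switching*1000:.1f} mW")  # ~18 mW
```

At 900 Hz with fast edges the average switching loss is tiny; it grows in proportion to switching frequency and transition time.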
Keep in mind that the datasheets for electronic parts are developed from specific testing done on a sample of devices. These measurements are a guide, not a recipe. For instance, the max Rds(on) is measured for a very short period of time (<300 µs), so it is not a useful number by itself.
Also, running a MOSFET at 175 °C is not recommended. Think of running your car at the red-line RPM of the engine: it is the red line, so everything slightly under the red line is OK, right?
Yes, it is a TO-220 package, and using a small heatsink would be the safest thing to do. I am just trying to wade through this minefield a bit to get some "rule of thumb" ideas without getting a degree in electronics! Trial and error could be costly, but at least with some basic guidelines and advice I have half a chance at success!
I'm struggling to understand the "Typical Output Characteristics" graphs! If I look along the x-axis for the drain-to-source voltage, say 10 V, and go up to the line for my gate voltage of 5 V, I read a drain current of 80 A? I get the feeling I'm missing something here!
What don't you like about this? With Vgs = 5 V, you need 10 V across the device to pass 80 A (the MOSFET would then dissipate 800 W and blow very soon if the power is not reduced).
When you use it to "short" a 10 V power supply, it will draw 80 A; again, not for long. There should be an SOA (safe operating area) graph showing how long a pulse is allowed.
80 A at a Vgs of 5 V is when it is switched ON hard at the specified Vds. At that current the only limiting factor is the device's Rds(on), which is of the order of tens of milliohms; the datasheet will tell you that. I² × Rds(on) will tell you how much it will dissipate.
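As a rough sense of scale for that I² × Rds(on) figure, a quick sketch using the 80 A read off the graph and the 0.025 Ω maximum at Vgs = 5 V quoted earlier:

```python
I_D = 80.0          # amps, the current read off the output-characteristics graph
RDS_ON_OHM = 0.025  # ohms, max Rds(on) at Vgs = 5 V from the datasheet table

vds = I_D * RDS_ON_OHM        # voltage across the device when fully on
p_diss = I_D**2 * RDS_ON_OHM  # I^2 * Rds(on)

print(f"Vds when fully on: {vds:.1f} V")  # ~2 V
print(f"Dissipation: {p_diss:.0f} W")     # ~160 W -- far beyond the 2.4 W free-air budget
```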
I found something on reading the graphs, and I think I understand now? The x-axis is the voltage drop between drain and source for a given current, so if I pick the 5 V line for Vgs, then choose my drain current on the y-axis and draw a line down to the x-axis, that gives the voltage drop across the MOSFET. I can use that to calculate the Rds(on). So for my possible 5 A load and using a Vgs of 4 V, the Vds is about 0.14 V, so Rds = 0.14/5 = 0.028 Ω at 25 °C; at 175 °C it is about 0.05 Ω.
Am I on the right lines here? If so, the next thing is to get my head around the switching losses!
So these figures are with the supply voltage connected directly to the drain of the MOSFET? So if I have a load connected, it will drop voltage and limit the current?
No, Vds is the voltage across drain and source when the 80 A is passing through it, with the load connected between the drain and the supply. If the device is saturated (in BJT speak), i.e. turned ON hard, its Vds will be very low, and the power it dissipates will be 80 × 80 × Rds(on).
In a switching application that is not interesting information, because that operating point is neither the on-state nor the off-state. In fact, any "typical" graph is pretty useless for circuit design, because the error bars aren't given (and they are often very large for FETs).
The only specs you can safely use are the minimum or maximum ones that guarantee performance parameters across temperature and device-spread.
You need to appreciate the specification that says
"Rds(on) = x for Vgs = y"
If the device says Rds(on) <= 0.1 ohm for Vgs >= 4.5V, for instance, that tells you most of what you need to know about the on-state. The off-state is Vgs=0V, and you'll see a leakage current spec that shows how "off" off actually is.
On the datasheet, in the Thermal Resistance Ratings table, there is a parameter Maximum Junction-to-Ambient (RthJA) which states a maximum of 62 °C/W. Is that saying that the temperature of the MOSFET will rise by 62 °C for every watt dissipated in the MOSFET itself, as opposed to every watt delivered to the load? I found a formula (in the first post) which I think is starting to make sense!
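That reading matches the first post's formula rearranged as Tj = Ta + Pd × RthJA. A minimal sketch, reusing the 62 °C/W figure and the ~0.055 Ω hot Rds(on) estimate from earlier in the thread, with an assumed 25 °C ambient and the planned 5 A load:

```python
RTH_JA_C_PER_W = 62.0   # C per watt, junction-to-ambient (no heatsink)
TA_C = 25.0             # assumed ambient temperature
RDS_ON_HOT_OHM = 0.055  # hot Rds(on) estimate from earlier in the thread
I_LOAD = 5.0            # planned load current

p_diss = I_LOAD**2 * RDS_ON_HOT_OHM  # watts dissipated in the FET itself
tj = TA_C + p_diss * RTH_JA_C_PER_W  # Pd = (Tj - Ta) / RthJA, rearranged

print(f"Dissipation at 5 A: {p_diss:.2f} W")           # ~1.4 W
print(f"Estimated junction temperature: {tj:.0f} C")   # ~110 C in free air
```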