According to my calculations that voltage divider would reduce 5V output from the Arduino to 4.926V at the MOSFET. That's the objection, I gather?
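For what it's worth, the arithmetic behind that figure can be sketched as below. The resistor values are my assumption, not stated in the discussion; a 150 Ω series gate resistor with a 10 kΩ pull-down at the gate is one pair that reproduces the quoted 4.926 V.

```python
# Voltage-divider sketch reproducing the ~4.926 V figure.
# R_SERIES and R_PULLDOWN are assumed example values, not from the thread.
V_OUT = 5.0            # Arduino output high level, volts
R_SERIES = 150.0       # assumed series gate resistor, ohms
R_PULLDOWN = 10_000.0  # assumed pull-down resistor at the gate, ohms

# The series resistor and the pull-down form a divider at the gate.
v_gate = V_OUT * R_PULLDOWN / (R_SERIES + R_PULLDOWN)
print(f"{v_gate:.3f} V")  # prints "4.926 V"
```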
Indeed. Why design a circuit with any performance restriction that can be avoided by simply doing it right, given that there is absolutely no performance, cost or risk penalty whatsoever in doing it right?
Now the point here is also about the mind-set. What is the problem? Well, the problem is that during the reset phase, the Arduino outputs are open circuit. (Incidentally, it is not a problem when the board is unpowered, as the protective diodes pull the lines down to the unpowered VCC rail.) But during reset it is a problem, because a floating gate may turn the FET at least partially on.
Here's the key question: is this a fault of the FET? No! The problem belongs to the Arduino (ATmega), so the correct approach is to pull down the Arduino pin, not the FET gate. If you reason through it step by step like this, it is pretty obvious that the pull-down belongs on the Arduino side.
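The DC consequence of the two placements can be sketched as below, reusing the same assumed resistor values as above (150 Ω series, 10 kΩ pull-down; neither value is from the discussion). Either placement holds the gate at 0 V during reset, when the pin is high-impedance, but only the gate-side placement forms a divider in normal operation.

```python
# Comparing pull-down placement at DC, with assumed example values.
V_OUT = 5.0      # Arduino output high level, volts
R_SERIES = 150.0 # assumed series gate resistor, ohms
R_PD = 10_000.0  # assumed pull-down resistor, ohms

# Pull-down at the FET gate (after the series resistor): the two
# resistors form a divider, so the gate never sees the full 5 V.
v_gate_divider = V_OUT * R_PD / (R_SERIES + R_PD)

# Pull-down at the Arduino pin (before the series resistor): the MOSFET
# gate draws essentially no DC current, so there is no drop across the
# series resistor and the gate sees the full drive voltage.
v_gate_pin_side = V_OUT

print(f"pull-down at gate: {v_gate_divider:.3f} V")  # ~4.926 V
print(f"pull-down at pin:  {v_gate_pin_side:.3f} V") # 5.000 V
```

This is exactly the "no penalty in doing it right" point: the pin-side placement fixes the reset problem without giving up any gate drive.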
Now here is an interesting musing on whether the series resistor is desirable at all. What could be wrong with inserting a resistor in series with the gate? Well, it is suggested that this will slow the transition of the gate voltage through the linear conduction region, and that having the FET only partially conducting will increase its power dissipation during the transient. Which is indeed correct. But given the description of how brief this charging transient is, that argument inherently suggests that it is more important to minimise the transient dissipation in the power FET than to minimise the transient dissipation in the ATmega output driver. So, which is it?
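To put rough numbers on "how brief", here is an order-of-magnitude sketch with assumed example values (a 1 nF MOSFET input capacitance and a 150 Ω series resistor; neither figure is from the discussion). Note that the total energy per charging edge is fixed at CV²/2 regardless of the resistor; the resistor only sets how fast the edge is and how the dissipation splits between the resistor and the ATmega output driver.

```python
# Order-of-magnitude estimate of the gate-charging transient.
# C_ISS and R_GATE are assumed example figures, not from the thread.
C_ISS = 1e-9    # assumed MOSFET input capacitance, farads (1 nF)
R_GATE = 150.0  # assumed series gate resistor, ohms
V_DRIVE = 5.0   # gate drive voltage, volts

# RC time constant of the gate-charging edge.
tau = R_GATE * C_ISS

# Energy dissipated per charging edge in the gate circuit (CV^2/2),
# independent of R; R only decides where it is dissipated.
e_edge = 0.5 * C_ISS * V_DRIVE**2

print(f"tau = {tau * 1e9:.0f} ns")             # prints "tau = 150 ns"
print(f"energy per edge = {e_edge * 1e9:.1f} nJ")  # prints "energy per edge = 12.5 nJ"
```

At these scales the transient is over in under a microsecond and the energy involved is tens of nanojoules, which is why the trade-off only matters at high switching frequencies.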