To add dissipating resistance (by FET or whatever) seems a foolish act at first glance...
And second, and third, and...
Indeed, the point is (usually) not to create the condition under which the rate of energy transfer is maximum. See Footnote.
We (usually) want the application circuit to suck energy out of the battery until it can't give any more. That is, until its output voltage falls below our minimum requirements.
Unless the object is to heat the environment, dissipating energy with an external resistor, FET, or whatever, reduces the amount that is available for use by the real application device. In other words, we want to reduce the resistance in the leads (and connectors, etc.) going to the device, to minimize the amount of energy that is wasted in heating stuff up. We certainly don't want to put an additional external dissipative device anywhere.
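If it helps to see numbers, here is a quick Python sketch (with a made-up 12 V battery and 0.05 ohm internal resistance; the values are just for illustration). A load matched to the internal resistance does get the most power per second, but only half the energy drawn from the battery actually reaches the load; a larger load resistance draws less power but wastes far less in the battery and the leads.

```
# Hypothetical battery: the numbers below are assumed, not from any real cell.
EMF = 12.0     # open-circuit voltage, volts
R_INT = 0.05   # internal resistance, ohms

print(f"{'R_load (ohm)':>12} {'P_load (W)':>12} {'efficiency':>12}")
for r_load in (0.01, 0.05, 0.1, 0.5, 1.0, 5.0):
    i = EMF / (R_INT + r_load)   # series-circuit current
    p_load = i * i * r_load      # power delivered to the load
    p_total = EMF * i            # total power drawn from the battery
    print(f"{r_load:12.2f} {p_load:12.1f} {p_load / p_total:12.2%}")
```

Peak load power lands at R_load = 0.05 ohm (the matched case) at 50% efficiency, while at 1 ohm the load gets less power but about 95% of the energy drawn.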
Regards,
Dave
Footnote:
One application that might lead us to want the maximum energy transfer rate is discharging the battery as fast as possible. An "active load" could sense the battery voltage and current and use that to continuously adjust itself to present an effective resistance equal to the (changing) internal resistance of the battery. I don't know if it would be worth it (compared with just attaching some big dissipative load), but it might be possible.
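Roughly how such an active load might work, as a Python sketch (the function names and readings are made up for illustration, not taken from any real tester): it re-estimates the internal resistance from two operating points and then sets its effective resistance to match.

```
def estimate_internal_resistance(v1, i1, v2, i2):
    # Two (terminal voltage, current) readings at different load settings
    # give the internal resistance as the slope: R_int = -dV/dI.
    return (v1 - v2) / (i2 - i1)

def matched_setpoint(v_open_circuit, r_int):
    # Maximum transfer rate when the effective load resistance equals R_int.
    r_load = r_int
    current = v_open_circuit / (r_int + r_load)
    return r_load, current

# Hypothetical readings taken moments apart under two different load settings.
r_int = estimate_internal_resistance(v1=11.8, i1=4.0, v2=11.6, i2=8.0)  # -> 0.05 ohm
print(matched_setpoint(v_open_circuit=12.0, r_int=r_int))               # -> (0.05, 120.0)
```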
I mean, I think it could be done, but I believe that typical active-load battery testers (used to characterize a battery's charge/discharge performance) don't try to achieve maximum energy transfer rate; they adjust their active load to maintain a constant current as the battery voltage decreases during the discharge cycle.
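For contrast, the constant-current behavior is simpler. A sketch (again with made-up numbers): the load just keeps presenting whatever effective resistance holds the target current as the terminal voltage sags.

```
def constant_current_resistance(v_terminal, i_target):
    # Effective resistance the active load must present to hold i_target
    # at the present terminal voltage (plain Ohm's law).
    return v_terminal / i_target

# As the battery sags during discharge, the load resistance is lowered to compensate.
for v in (12.0, 11.5, 11.0, 10.5):
    print(f"{v:.1f} V -> {constant_current_resistance(v, i_target=5.0):.2f} ohm")
```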
Since most applications do not want to discharge the battery as fast as possible...well, I hope you get the point.