An amplifier's total harmonic distortion is simply a measure of how much the amplifier's output deviates from the original input signal's waveform, that is, how much distortion the amplifier adds to the input signal. No linear amplifier is perfect; reality has yet to come up with the long-sought-after 'wire with gain'. If you drive an amp too close to its power rails, or demand more current than it can supply (say, by trying to drive a speaker with too low an impedance value), the output waveform will deviate from the input waveform, increasing the output distortion figure. It's all a trade-off that the designer has to deal with.
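As a rough sketch of what that number means (my own illustration, not how any particular distortion analyzer works), THD is the RMS sum of the harmonic content the amp adds, relative to the fundamental:

```python
import numpy as np

# Hypothetical illustration of what a THD figure measures; real distortion
# analyzers are more involved, but the idea is the same.
fs = 48000                      # sample rate in Hz (assumed)
f0 = 1000                       # fundamental test tone in Hz
N = 4800                        # exactly 100 cycles, so FFT bins land on the harmonics
t = np.arange(N) / fs

# Pretend "amplifier output": the test tone plus a little 2nd and 3rd harmonic
out = (1.0   * np.sin(2 * np.pi * f0 * t)
     + 0.01  * np.sin(2 * np.pi * 2 * f0 * t)
     + 0.005 * np.sin(2 * np.pi * 3 * f0 * t))

spectrum = np.abs(np.fft.rfft(out))
bin_hz = fs / N                 # 10 Hz per FFT bin

fundamental = spectrum[int(f0 / bin_hz)]
harmonics = [spectrum[int(n * f0 / bin_hz)] for n in range(2, 6)]

# THD = RMS sum of the harmonics relative to the fundamental
thd = np.sqrt(sum(h**2 for h in harmonics)) / fundamental
print(f"THD ~ {thd * 100:.2f}%")    # about 1.12% for the amplitudes above
```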
That datasheet gives two operating conditions the user can choose to operate at: 1% or 10% distortion. Hi-fi amplifiers were typically those that added no more than 0.1% total distortion, often far less.
Passing a square wave through an audio linear amp is one method of testing its upper linear frequency response, because a square wave is a fundamental sine wave plus all of its odd harmonics, and since the amplifier has specific upper and lower frequency limits, the output waveform will deviate from a true square wave at some input frequency. Maximum power output, maximum total harmonic distortion, and flat frequency response are interrelated 'goodness' ratings for any specific amplifier.
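To make that concrete, here is a small sketch (my own illustration, nothing from the datasheet) that builds a square wave from its odd harmonics; truncating the series is roughly what a band-limited amplifier does to the signal:

```python
import numpy as np

# A square wave at f0 is the fundamental plus all of its odd harmonics:
#   sq(t) = (4/pi) * [ sin(2*pi*f0*t) + sin(2*pi*3*f0*t)/3 + sin(2*pi*5*f0*t)/5 + ... ]
# An amplifier with a limited upper frequency response effectively cuts this
# series off, which rounds the corners of the reproduced square wave.
fs = 192000                         # sample rate for the simulation (assumed)
f0 = 1000                           # 1 kHz test square wave
t = np.arange(0, 2 / f0, 1 / fs)    # two cycles

def band_limited_square(t, f0, f_max):
    """Sum the odd harmonics of f0 up to f_max, a crude stand-in for an amp's bandwidth."""
    wave = np.zeros_like(t)
    n = 1
    while n * f0 <= f_max:
        wave += (4 / np.pi) * np.sin(2 * np.pi * n * f0 * t) / n
        n += 2
    return wave

wide   = band_limited_square(t, f0, 100_000)   # wide-band amp: many harmonics survive
narrow = band_limited_square(t, f0, 5_000)     # amp rolling off at ~5 kHz: only 1st, 3rd, 5th

# Compare the two; the 'narrow' version is visibly rounded and rings at the edges.
print("wide peak:", wide.max(), " narrow peak:", narrow.max())
```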
Furthermore, are you saying that at 5v with a 4ohm speaker, if I output a square wave then I will in fact be outputting 6.25W and 1.25A as I originally calculated?
In theory only. In reality the output transistors in the amp won't allow the full +5 VDC to pass through to the speaker; there are always some losses. And even if you got close to 6.25 watts, those wouldn't be sine-wave watts, which is what audio power ratings are specified in, but rather a kind of 'peak' power figure, so you would be comparing apples to oranges.
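For the arithmetic behind those numbers (assuming an ideal output that swings the full 5 V across the speaker, which a real chip won't quite manage):

```python
# Ideal numbers only; a real amp drops some voltage across its output devices,
# so the speaker never sees the full supply.
V_peak = 5.0        # volts across the speaker, assuming full rail-to-rail swing
R_load = 4.0        # speaker impedance in ohms

# A square wave that keeps the full 5 V across the speaker at all times has an
# RMS voltage equal to its peak, so:
P_square = V_peak**2 / R_load        # 6.25 W
I_square = V_peak / R_load           # 1.25 A

# A sine wave with the same 5 V peak has an RMS of 5/sqrt(2) ~ 3.54 V, and
# sine-wave (RMS) watts are what audio power ratings are quoted in:
V_rms_sine = V_peak / 2**0.5
P_sine = V_rms_sine**2 / R_load      # ~3.13 W, roughly half the square-wave figure

print(P_square, I_square, P_sine)
```

So even the ideal 6.25 W is a square-wave 'peak' number; the comparable sine-wave rating from the same supply is closer to 3 W, before any real-world losses.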
Lefty