That datasheet gives two operating conditions the user can choose between: 1% or 10% distortion. Hi-Fi amplifiers were typically those that added no more than 0.1% total distortion, often far less.
Okay, but how does one "decide" to operate at a specific level of distortion? There are two levels listed for the same input voltage and output power, so there must be another variable you adjust to get there.
Also, I do want to understand how all this works, but the main thing I'm concerned about is whether supplying 5 V to the amp is going to blow a speaker like this:
It says it's rated for 2 W (2.5 W max), but that graph showing 2.75 W at 10% THD concerns me. And I'm even more concerned now that it sounds like you're saying the power rating assumes a best-case sine wave, and that even more power gets delivered if something like a square wave or, worst of all, white noise is played.
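The sine-vs-square worry above comes down to crest factor: a square wave of the same peak voltage delivers twice the average power of a sine. A quick sketch (the 4 ohm load and the peak voltage here are illustrative numbers, not from the datasheet; the peak is just chosen so the sine case lands near 2.75 W):

```python
R = 4.0    # assumed speaker impedance in ohms, treated as purely resistive
Vp = 4.69  # hypothetical peak output voltage, picked so the sine case ~= 2.75 W

# Average power of a sine wave with peak Vp: P = Vp^2 / (2R)
p_sine = Vp**2 / (2 * R)

# Average power of a full-swing square wave with the same peak: P = Vp^2 / R
p_square = Vp**2 / R

print(f"sine:   {p_sine:.2f} W")    # ~2.75 W
print(f"square: {p_square:.2f} W")  # ~5.50 W, i.e. exactly 2x the sine case
```

So yes, for the same peak swing, heavily clipped or square-ish content can push roughly double the sine-wave power into the speaker, which is why program material matters and not just the rated number.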
My concerns are probably unfounded, but if the amp will actually put 2.5 W into a 4 ohm speaker, I'd like to know. That could be both good and bad: more power means more volume, but not if it ends up blowing most small speakers.
Also, I know there will be some voltage drop across the DAC and the amp with my filtering; I don't know how much yet, but I'm guessing it will bring the output down from 2.75 W to 2.5 W or less. Still, I'm wondering what the worst case here is.
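Since output power scales with the square of the available voltage swing, even a small supply droop helps. A rough sketch of the idealized ceiling, assuming a bridged (BTL) output that can swing nearly rail to rail into 4 ohms; the exact amp topology and its losses aren't stated here, so treat these as upper bounds, not predictions:

```python
def max_sine_power(v_supply, r_load=4.0):
    """Idealized max undistorted sine power for a bridged output.

    Assumes peak swing ~= supply voltage, so P = Vp^2 / (2R).
    Real amps lose some headroom to the output stage, so actual
    power will be lower than this.
    """
    return v_supply**2 / (2 * r_load)

for v in (5.0, 4.8, 4.5):
    print(f"{v:.1f} V supply -> {max_sine_power(v):.2f} W ideal max")
```

With these assumptions, dropping the rail from 5.0 V to 4.5 V takes the ideal ceiling from about 3.1 W down to about 2.5 W, which is consistent with the guess that filtering losses pull the 2.75 W figure down toward 2.5 W or less.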