Faster Analog Read?

I wonder whether the use of the term "resolution" rather than "accuracy" is significant. That note discusses calibrating an ADC; it is not clear what errors an uncalibrated ADC will exhibit if run outside the recommended clock speeds.

But the note says that calibration applies only to differential ADC mode:

For most applications, the ADC needs no calibration when using single ended conversion. The typical accuracy is 1-2 LSB, and it is often neither necessary nor practical to calibrate for better accuracies.

However, when using differential conversion the situation changes, especially with high gain settings.

I believe the note uses "accuracy" and "resolution" interchangeably:

The AVR uses the test fixture's high accuracy DAC (e.g. 16-bit resolution) to generate input voltages to the calibration algorithm.