Scaling ADC input

I am using a Due for a research project, but I have run into a bit of a problem.

The Due ADC reads values from 0 to 3.3V. However, my inputs are -10V to +10V.

I have only a very basic knowledge of electronics, so this is way beyond me.

Does anyone know of some kind of circuit I can build to scale the voltage range from +/-10V into 0-3.3V so it can be read by the Due ADC? Just for clarity, I am looking for some type of circuit to map -10V into 0V, and +10V into +3.3V.

Links are helpful too!

Without knowing the output impedance of the voltage source, it's not clear whether a passive resistor network can be used.
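
To make the loading problem concrete, here is a rough worked example (the resistor values and the 1k source impedance are just assumed figures for illustration):

[code]
Intended 1:6 divider:  R1 = 10k, R2 = 2k
    ratio = R2 / (R1 + R2) = 2k / 12k ≈ 0.167

Same divider driven through Rs = 1k of source impedance:
    ratio = R2 / (Rs + R1 + R2) = 2k / 13k ≈ 0.154   (about 8% low)
[/code]

If the source impedance is low (an op-amp output, say), the error is negligible; if it is unknown or high, you need a buffer.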

Alternatively, you need a rail-to-rail op-amp in a differential amplifier configuration with a gain of about 0.165 (the 3.3V output span divided by the 20V input span), plus a 1.65V Vref to shift the output. Wikipedia has some info on this configuration, I believe.
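
As a sanity check on those numbers, the transfer function you want is (mapping the 20V input span onto the 3.3V ADC span):

[code]
Vout = Vin × (3.3 / 20) + 1.65
     = Vin × 0.165 + 1.65

Vin = -10V  →  Vout = 0V
Vin =   0V  →  Vout = 1.65V
Vin = +10V  →  Vout = 3.3V
[/code]

which is where the ~0.16 gain and the 1.65V Vref come from.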

There may be more than one solution (there usually is), but here's what comes to mind:

First, a 1:6 [u]Voltage Divider[/u] (2 resistors) to knock the signal down from a 20V span to a 3.3V span (-1.67V to +1.67V).
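
For example, with hypothetical values (any pair in a 5:1 ratio works, chosen to suit the source impedance and the current you can afford to draw):

[code]
Vdiv = Vin × R2 / (R1 + R2)

R1 = 50k, R2 = 10k  →  Vdiv = Vin / 6
    +10V → +1.67V,   -10V → -1.67V
[/code]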

Then, a [u]Summing Amplifier[/u] (an op-amp and a few resistors) with a gain of 1.0 and +1.67V applied to one input (your signal goes to the second input). You'll need a second voltage divider to provide the 1.67V, or you can scale the summing amplifier's gain appropriately on that input.
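
Numerically, this stage just adds the offset:

[code]
Vout = Vdiv + 1.67V

-1.67V → 0V
 0.00V → 1.67V
+1.67V → 3.34V
[/code]

One caveat: the textbook summing amplifier is inverting (Vout = -(V1 + V2) for equal resistors), so in practice you either sum in a -1.67V reference and accept an inverted (but still 0-3.3V) output that you can flip in software, or add a unity-gain inverting stage after it.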

The op-amp will need to be powered with a bipolar (plus and minus) power supply.

And if you need to read the full range accurately, you'll probably want to scale the voltage down a bit more. The ADC may become inaccurate near the full 3.3V range, especially if the 3.3V supply is running a few millivolts low.
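
On the software side, once the hardware maps ±10V onto 0-3.3V, undoing the scaling on the Due is one line of arithmetic. A minimal sketch, assuming the exact 0.165 gain and 1.65V offset described above (the pin choice A0 is arbitrary):

[code]
const float VREF = 3.3;     // Due ADC reference, nominal

void setup() {
  Serial.begin(115200);
  analogReadResolution(12); // the Due supports 12-bit reads: 0..4095
}

void loop() {
  int counts = analogRead(A0);          // raw reading, 0..4095
  float vAdc = counts * VREF / 4095.0;  // voltage at the ADC pin, 0..3.3V
  float vIn  = (vAdc - 1.65) / 0.165;   // undo offset and gain: -10..+10V
  Serial.println(vIn);
  delay(100);
}
[/code]

If your real divider ratio or Vref differs slightly, calibrate the two constants against known inputs rather than trusting the resistor values.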

But as I hinted above, a voltage divider requires knowledge of the output impedance of the signal source.