I know that the output from a voltage divider is determined by the ratio of R2 to (R1 + R2), where R1 is on the input side and R2 is on the ground side.
But how do you figure out what absolute resistances you should use, as opposed to the ratio?
If I want a 50% output, that formula tells me I need R1 = R2. By that formula, they could both be 220 ohms or 220 zillion ohms.
How do we know what general magnitude we should be using: ohms, kilohms, or megohms?
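To make the question concrete, here is a quick sketch of the ratio formula showing why it can't answer the sizing question on its own (the 220-ohm values are just illustrative):

```python
# Sketch: the divider ratio depends only on the ratio of the two
# resistors, so equal resistors give 50% at any absolute size.
def divider_ratio(r1, r2):
    """Fraction of the input voltage appearing across r2 (the ground-side resistor)."""
    return r2 / (r1 + r2)

print(divider_ratio(220, 220))      # 0.5
print(divider_ratio(220e6, 220e6))  # 0.5
```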
Normally, you want to use the highest resistance you can in order to minimize current drain.
But of course you have to temper that with experience as well as the load the divider is going to drive.
For a 1/2-1/2 divider, you don’t want something absurd like 10 megohms and 10 megohms… mere humidity and fingerprints will affect the divider.
But you don’t want a 10 ohm-10 ohm divider either because it draws too much current.
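The current-drain side of that tradeoff is easy to put numbers on. A minimal sketch, assuming a 5 V supply (the supply voltage isn't stated in the post):

```python
# Sketch: current drawn by an unloaded R1-R2 divider at an assumed 5 V supply.
V_IN = 5.0  # assumed supply voltage for illustration

def divider_current(r1, r2, v_in=V_IN):
    """Current (in amps) flowing through an unloaded series divider."""
    return v_in / (r1 + r2)

# 10 ohm + 10 ohm: a quarter of an amp wasted as heat -- far too much.
print(divider_current(10, 10))        # 0.25 A
# 10 megohm + 10 megohm: only 0.25 microamps, but now leakage paths
# (humidity, fingerprints) carry comparable currents and skew the ratio.
print(divider_current(10e6, 10e6))    # 2.5e-07 A
```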
So, what to use?
First, think about what the divider OUTPUT will be connected to. If it’s a very high input impedance device (10 megohms or more) like an oscilloscope input, an op-amp non-inverting input, or a DVM input, something quite high in the 100K range will work fine.
If you are driving something that has a fairly low input impedance (like an Arduino A/D input or an audio amp line input), then you need a divider “stiff enough” not to be pulled down much by the load impedance… something like 10K-10K or even 4.7K-4.7K.
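You can see what "stiff enough" means by treating the load as a resistor in parallel with the bottom leg of the divider. A sketch, again assuming a 5 V supply and a hypothetical 100K load:

```python
# Sketch: output of a 50% divider when a load resistance sits in
# parallel with the bottom resistor. Supply and load values are assumed.
V_IN = 5.0  # assumed supply voltage

def loaded_divider_out(r1, r2, r_load, v_in=V_IN):
    """Output voltage with r_load in parallel with the ground-side resistor r2."""
    r2_eff = (r2 * r_load) / (r2 + r_load)  # parallel combination
    return v_in * r2_eff / (r1 + r2_eff)

# Stiff 10K-10K divider into a 100K load: droops only ~5% from the ideal 2.50 V.
print(round(loaded_divider_out(10e3, 10e3, 100e3), 2))    # 2.38
# Weak 100K-100K divider into the same load: sags badly, a third of the supply.
print(round(loaded_divider_out(100e3, 100e3, 100e3), 2))  # 1.67
```

The stiffer divider wastes more standby current but holds its ratio under load; that is the balance the rest of the answer describes.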
In general, most dividers that you would use will range from 100K or so (total resistance) to as low as 1K total resistance.
The determining factor will be the load the divider sees.
Hope this helps.