I'm trying to fully grok voltage dividers and one aspect I am trying to resolve is the reason to use larger or smaller resistor values despite the fact that the ratio is what matters.

An example (using http://www.raltron.com/cust/tools/voltage_divider.asp as a sanity check):

```
Input voltage: 10 V
R1: 10000
R2: 10000
Output voltage: 5 V
```

This makes sense, and you get the same result if you change both resistors to 100. Now the obvious question is which range you should prefer: higher resistances or lower? 10k or 100? The choice seems important when determining what type of resistor to use, i.e. 1/8 W, 1/4 W, 1/2 W, etc.
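To sanity-check the ratio claim, here's a quick Python sketch (my own, not from the calculator site) showing that equal resistor pairs of wildly different sizes all give the same output:

```python
# Divider formula: Vout = Vin * R2 / (R1 + R2) -- only the ratio matters.
vin = 10.0
for r in (100, 10_000, 1_000_000):
    vout = vin * r / (r + r)   # equal resistors, whatever their absolute size
    print(f"R1 = R2 = {r} ohm -> Vout = {vout} V")   # 5.0 V every time
```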

Another example:

```
Let's say we build the divider from two of those 100 ohm resistors across the 10 V input. In series they add up to 200 ohms, so by Ohm's Law I = V / R = 10 / 200 = 0.05 A.
For the watt rating (power) we use the power formula P = I * V. Across the whole divider that's P = 0.05 * 10 = 0.5 W, and since each resistor drops 5 V, each one dissipates 0.05 * 5 = 0.25 W.
```

Each resistor therefore needs to handle 0.25 W, which already rules out common 1/8 W parts. So when making a voltage divider with such low resistor values you actually want resistors with a higher power rating, otherwise they can't dissipate the heat and will likely fail (i.e. burn up).

Now, if we had used 10kOhm resistors the math would work out to:

```
I = 10 / 20000 (two 10k resistors in series)
I = 0.0005 amp
P = 0.0005 * 10
P = 0.005 watt total, or 0.0025 watt per resistor
This voltage divider could easily use 1/8 watt 10 kOhm resistors.
```
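Both worked examples can be folded into one small Python helper (a hypothetical function of my own, just to check the arithmetic) that reports the output voltage and the per-resistor dissipation:

```python
def divider(vin, r1, r2):
    """Output voltage and per-resistor power for a two-resistor divider."""
    i = vin / (r1 + r2)                  # series current through both resistors
    vout = i * r2                        # output is the drop across R2
    return vout, i * i * r1, i * i * r2  # P = I^2 * R for each resistor

print(divider(10, 100, 100))        # ~5.0 V out, ~0.25 W per resistor
print(divider(10, 10_000, 10_000))  # ~5.0 V out, ~0.0025 W per resistor
```

The 100 ohm pair needs at least 1/2 W parts to have headroom, while the 10 kOhm pair is comfortably inside a 1/8 W (0.125 W) rating.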

Am I making valid assumptions? Am I missing anything, or does this all add up?

The practical consideration in all this is that when buying resistors in bulk, it might be fine to just buy the cheaper 1/4 watt resistors at higher resistance values rather than wasting money on higher-wattage resistors (which cost quite a bit more).