Voltage Divider Question

I have a 12 V DC switching power supply driving LED strips via MOSFETs. The power supply is rated at 6 A. My ACS712 and my Fluke both register 1.26 A when the strip is on.

At zero load the PS puts out 12.87 V. With the 1.26 A load it drops to 12.67 V. Not a huge drop. If I build a voltage divider to bring it down to 5 V so I can monitor it on my uC, do I base the resistor values on the loaded or unloaded voltage? It seems like a moving target.

Thanks

What I would do, just to keep the maths easy, is design the voltage divider to divide by an integer amount. In this case, probably three. So the largest voltage presented to the microcontroller will be about 4.3 V. This doesn't squeeze the absolute maximum resolution out of the ADC, because it doesn't use the full 0-5 V scale, but it's close enough.
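To make the divide-by-three concrete (the resistor values here are one hypothetical pair, not from the thread): with 20k from the supply to the ADC pin and 10k from the pin to ground, Vout = Vin * R2 / (R1 + R2) = Vin / 3, so the unloaded 12.87 V becomes about 4.29 V at the pin and the loaded 12.67 V about 4.22 V.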

It doesn't really matter, but I would base it on the highest. Just don't calculate the divider to output +5 VDC at the normal PS output voltage, as any rise in voltage could damage the analog input pin. I would design it for, say, +4.00 V at the maximum supply output voltage to give you some safe headroom. Note that you can use the map() function to scale the reading to any range you want to print out.
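To illustrate the map() suggestion, here is a minimal sketch assuming a divide-by-3 divider (the 20k/10k pair above) feeding pin A0; the pin choice and resistor values are my assumptions, not from the thread. With a 5 V ADC reference, full scale then corresponds to 15 V at the supply:

```cpp
// Assumed wiring: 12 V supply -> 20k -> A0 -> 10k -> GND (divide-by-3).
// With a 5 V reference, a full-scale reading (1023) means 15 V at the supply.

const int SUPPLY_PIN = A0;  // hypothetical pin choice

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(SUPPLY_PIN);          // 0..1023 for 0..5 V at the pin
  // map() rescales the raw count into supply millivolts: 0..1023 -> 0..15000
  long supply_mV = map(raw, 0, 1023, 0, 15000);
  Serial.print("Supply: ");
  Serial.print(supply_mV / 1000.0, 2);       // print as volts, two decimals
  Serial.println(" V");
  delay(1000);
}
```

Scaling to millivolts rather than volts matters because map() works in integer (long) maths; mapping straight to 0..15 would quantise the reading to whole volts.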

Ok, thanks guys. I'm fighting the urge to just hack an ATX PS for the cheap amps, but it's so messy. Gonna squeeze what I can out of the 6 A PS first. I'm good above 11.7 V, but below that it gets iffy.

dotJason: do I base the resistor values on the loaded or unloaded voltage? It seems like a moving target.

Thanks

Depends on whether you are monitoring with a digital pin or an analog pin.

For a digital pin, choose your divider to have a high enough impedance (10k or so) and it won't matter if the output is a bit above 5 V, as the input protection diode will cope.
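As a rough illustration of why impedance matters (my numbers, not from the thread): a 10k/10k divider from 12.87 V would sit at about 6.4 V unclamped, but the protection diode holds the pin near 5.5 V, and the excess current through the divider's ~5k Thevenin impedance is only about 0.2 mA, comfortably under the roughly 1 mA commonly considered safe for the AVR clamp diodes.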

For an analog pin, just make sure the maximum output voltage stays a bit below 5 V so the reading never clips, and you won't lose any usable range.
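If you want to sanity-check a candidate divider before soldering, a throwaway calculation like this does the job (plain C++; the worst-case supply voltage and resistor values are hypothetical):

```cpp
#include <cstdio>

int main() {
    const double v_supply_max = 13.0;  // assumed worst-case unloaded supply
    const double r_top = 20000.0;      // assumed: 20k from supply to pin
    const double r_bottom = 10000.0;   // assumed: 10k from pin to ground

    // Standard divider equation: Vpin = Vsupply * Rbottom / (Rtop + Rbottom)
    double v_pin = v_supply_max * r_bottom / (r_top + r_bottom);
    std::printf("Worst-case pin voltage: %.2f V (%s)\n", v_pin,
                v_pin < 5.0 ? "below 5 V, OK" : "too high, resize divider");
    return 0;
}
```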