Interesting question: I'm working on a specialized LED controller right now. It runs off a 3.9V 18650 Li-Ion battery. The LEDs have an average Vf of 2.8V, with a max Vf of 3.2V.
I'd like the LEDs to stay at a constant brightness (i.e., not dim as the battery voltage sags). Then at 3V (Vthresh), a voltage monitor will throw a flag that dims the LEDs to 10% to signal the user that the battery is about as low as it can get before it can no longer safely power the LEDs.
So here's what I'm seeing/thinking: according to the numbers, a 1R series resistor still gets me the 2.8V Vf at the 3V Vthresh (0.2V across 1R works out to 200mA). But at 3.9V, that same 1R resistor leaves 3.7V at the LEDs, which is obviously too much unless I want fireworks.
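For reference, here's the back-of-the-envelope behind those numbers as a quick sketch. It's a simplified model that pins the resistor drop at I*R = 0.2V (in reality the current climbs rather than the LED voltage, but it's fireworks either way):

```c
#include <stdio.h>

int main(void)
{
    const double r_ohm = 1.0;    /* proposed series resistor */
    const double i_led = 0.200;  /* 200 mA: (3.0 V - 2.8 V) / 1 ohm at Vthresh */
    const double vbatt[] = { 3.0, 3.3, 3.6, 3.9 };

    for (int i = 0; i < 4; i++) {
        /* With the drop pinned at I*R = 0.2 V, everything left lands on the LED. */
        double v_led = vbatt[i] - i_led * r_ohm;
        printf("Vbatt = %.1f V -> LED sees %.1f V\n", vbatt[i], v_led);
    }
    return 0;
}
```

At 3.9V in, the LED ends up at 3.7V, well past the 3.2V max Vf.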
So what if I were to use PWM to control the effective voltage supplied to the LEDs, starting and staying at "3V", and monitor the battery voltage using the MCU's built-in bandgap reference so the duty cycle can be adjusted to compensate as the battery sags? That way the LEDs see an average of 3V, giving me the 2.8V Vf. Then when the battery actually hits 3V for real, the voltage monitor flags the MCU and the LEDs dim to 10%.
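Here's a minimal sketch of what I have in mind. The ADC/PWM helpers are hypothetical stand-ins for whatever the actual MCU's drivers provide, and I'm assuming a 10-bit ADC and a 1.1V internal bandgap (check the datasheet for yours):

```c
#include <stdint.h>

/* Hypothetical hooks -- stand-ins for the real MCU's ADC and PWM drivers. */
extern uint16_t adc_read_bandgap(void);  /* raw reading of bandgap, Vcc as ADC reference */
extern void     pwm_set_duty_pct(uint8_t pct);

#define VBG_MV     1100u  /* internal bandgap in mV (assumed; check your datasheet) */
#define ADC_FULL   1023u  /* 10-bit ADC assumed */
#define VTARGET_MV 3000u  /* effective voltage the LEDs should see */
#define VTHRESH_MV 3000u  /* low-battery flag level */

void update_led_drive(void)
{
    /* With Vcc as the ADC reference, the fixed bandgap reading gives us Vcc:
       reading = Vbg / Vcc * FULL  =>  Vcc = Vbg * FULL / reading. */
    uint16_t raw = adc_read_bandgap();
    if (raw == 0) return;  /* guard against a bogus conversion */
    uint32_t vbatt_mv = (uint32_t)VBG_MV * ADC_FULL / raw;

    if (vbatt_mv <= VTHRESH_MV) {
        pwm_set_duty_pct(10);  /* battery at Vthresh: dim to 10% as the warning */
    } else {
        /* Hold the average drive at ~3 V: duty = Vtarget / Vbatt.
           3.9 V -> ~77%, ramping toward 100% as the cell sags to 3 V. */
        uint32_t duty = (uint32_t)VTARGET_MV * 100u / vbatt_mv;
        if (duty > 100u) duty = 100u;
        pwm_set_duty_pct((uint8_t)duty);
    }
}
```

The duty math is just the average-voltage view of the same numbers from above: duty scales down as the battery charges up, and hits 100% right at Vthresh.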
I figure this way I can also ensure the LED brightness remains constant, which is what my client wants.
Just looking to bounce this off y'all. What do you guys think? Thanks!