I’ve seen two schools of thought about this, basically:
- Duty Cycle = 255 * (VR / VP)
- Duty Cycle = 255 * (VR² / VP²)
VR = Voltage Required
VP = Peak Voltage (5 V)
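To make the two formulas concrete, here is a quick Python sketch of both, plugged into the 12 V / 6 V numbers from my setup below (the function names are just mine, and I'm assuming an 8-bit PWM where 255 = always on):

```python
def duty_linear(vr, vp):
    """Duty value (0-255) that matches the AVERAGE voltage: 255 * (VR / VP)."""
    return 255 * (vr / vp)

def duty_squared(vr, vp):
    """Duty value (0-255) that matches the delivered POWER: 255 * (VR^2 / VP^2)."""
    return 255 * (vr / vp) ** 2

print(duty_linear(6, 12))   # 127.5 -> about 50% duty
print(duty_squared(6, 12))  # 63.75 -> about 25% duty
```

So the two schools of thought give 50% and 25% respectively for the same target voltage, which is exactly the discrepancy I'm seeing.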
I am actually switching a much higher voltage (12 V), using a PWM output to turn a power MOSFET on and off. This all works, but I was getting strange results: at a 50% duty cycle (to simulate 6 V), the load consumed twice the power it would on a straight 6 V supply. Only at a 25% duty cycle did the power match what 6 V should use.

Working through the numbers: a 10 ohm load at 6 V should dissipate 3.6 W. At a 50% duty cycle, I'm running the 10 ohm load at 12 V for half the time. Now 10 ohms at 12 V is 14.4 W, so averaged over the cycle that comes to 7.2 W, which is indeed double what would be used at 6 V. So a "friend" told me to use an RMS calculation to work out the duty cycle, which in this case would be 25%.

The only thing is, a voltmeter doesn't agree. At a 50% duty cycle the voltmeter reports 6 V, and at 25% it reports 3 V, yet the load behaves at 25% the way it should at 6 V. I'm just completely confused. Can someone clear all this up? I'd be so grateful.
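For reference, here is a short Python sketch of how I'm checking the numbers above (10 ohm resistive load, 12 V supply; the average-reading assumption for the meter is my guess at why it reports 6 V):

```python
V_SUPPLY = 12.0  # volts actually being switched
R_LOAD = 10.0    # ohms, purely resistive load

def avg_voltage(duty):
    """What a DC voltmeter reports: the time-average of the waveform."""
    return V_SUPPLY * duty

def rms_voltage(duty):
    """RMS of a 0-to-V_SUPPLY square wave: V_SUPPLY * sqrt(duty)."""
    return V_SUPPLY * duty ** 0.5

def power(duty):
    """Average power into the resistor, from the RMS voltage."""
    return rms_voltage(duty) ** 2 / R_LOAD

print(avg_voltage(0.50))  # 6.0 V -- the meter says 6 V...
print(power(0.50))        # 7.2 W -- ...but power is double the 3.6 W of a true 6 V
print(avg_voltage(0.25))  # 3.0 V -- the meter says 3 V...
print(power(0.25))        # 3.6 W -- ...yet power matches a true 6 V supply
```

This reproduces exactly what I'm measuring: the meter tracks the average voltage, while the load heats up according to the RMS voltage.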