Is there a rule of thumb for finding the right resistor power rating when working with pulsed signals?
I have a pulsed signal of variable voltage which is at most 75V. Say I put a 1kΩ resistor across it; the current would then be 75V/1000Ω = 75mA, and the power dissipated in the resistor would be 75V × 0.075A = 5.625W. Beefy resistor! However...
The signal is pulsed and the duty cycle is at most 4%. What would an acceptable rating be for a resistor under these circumstances? Do we spread the 5.625W out over time? Considering the 4% duty cycle, 5.625W × 0.04 = 225mW...
Does that mean that I could use a standard 250mW-rated 1kΩ resistor?
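For what it's worth, the arithmetic above is easy to sanity-check in a few lines of plain Python (the 75V, 1kΩ, and 4% figures are taken straight from the question):

```python
# Peak and duty-cycle-averaged dissipation in the resistor.
V_PEAK = 75.0   # V, worst-case pulse amplitude
R = 1000.0      # ohms
DUTY = 0.04     # maximum duty cycle (4%)

i_peak = V_PEAK / R        # 0.075 A
p_peak = V_PEAK * i_peak   # 5.625 W while the pulse is on
p_avg = p_peak * DUTY      # 0.225 W averaged over a full cycle

print(f"peak: {p_peak:.3f} W, average: {p_avg:.3f} W")
# -> peak: 5.625 W, average: 0.225 W
```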
Yeah, it's the heat that burns up a resistor, and that depends on the average power dissipation.
That said, I'd use a 1/2W resistor because it's better not to push a part to its limits. I've always been told that the resistor should be rated for twice the actual power dissipation. I'm sure that's not always necessary, but it's a good rule of thumb, and regular low-power resistors are cheap.
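A rough sketch of that selection rule, assuming the 2× derating factor and a typical set of standard power ratings (the list of ratings here is my assumption, not anything from the thread):

```python
# Smallest standard rating with at least 2x headroom over the
# 0.225 W average dissipation computed for this signal.
STANDARD_RATINGS_W = [0.125, 0.25, 0.5, 1.0, 2.0, 5.0]  # assumed catalog values
DERATE = 2.0

p_avg = 0.225                        # W, from the duty-cycle calculation
needed = p_avg * DERATE              # 0.45 W
choice = min(r for r in STANDARD_RATINGS_W if r >= needed)
print(f"need >= {needed:.2f} W -> pick a {choice} W part")
# -> need >= 0.45 W -> pick a 0.5 W part
```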
The pulse timing might also be a consideration... if a single pulse lasts 1 second, that may be long enough for the resistor to overheat at the full peak power before it gets a chance to cool.
Also, if there's a possibility of an error or failure mode where the signal stays on without pulsing, that could be a consideration.
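Both of those caveats reduce to simple worst-case numbers; a sketch, reusing the peak power from above (the 1-second pulse is the hypothetical from the previous paragraph):

```python
# Worst-case checks beyond the average-power figure.
P_PEAK = 5.625   # W, dissipation while the pulse is on

# 1) Energy dumped into the part by one long pulse: a 1 s pulse
#    deposits about 5.6 J, far more than a small resistor can
#    absorb without a large temperature rise.
t_long_pulse = 1.0                       # s
print(f"energy in one pulse: {P_PEAK * t_long_pulse:.3f} J")

# 2) Stuck-on failure: duty cycle goes to 100%, so dissipation
#    reverts to the full peak power, well above a 0.5 W rating.
print(f"stuck-on dissipation: {P_PEAK:.3f} W")
```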
I hadn't considered that; I should have added that info. The overheating scenario will never happen. Well, reasonably never.
At its fastest it's a pulse frequency of 150Hz with a maximum pulse width of 260µs. I calculated it as follows: 1/150Hz ≈ 6667µs pulse interval. At a 260µs pulse width that's 260/6667 ≈ 3.9% duty cycle. Is that right?
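That checks out; duty cycle is just pulse width times repetition rate:

```python
# Duty cycle from pulse frequency and maximum pulse width.
F_PULSE = 150.0    # Hz
T_ON = 260e-6      # s (260 µs maximum pulse width)

period = 1.0 / F_PULSE     # ~6667 µs
duty = T_ON / period       # equivalently T_ON * F_PULSE
print(f"period: {period * 1e6:.0f} us, duty: {duty:.2%}")
# -> period: 6667 us, duty: 3.90%
```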