Pulse withstanding capability of standard resistors

Any reputable resistor supplier will publish pulse power dissipation limits, such as this one from Vishay:

(Vishay chart: single-pulse power rating versus pulse duration for various chip resistor sizes)

It tells you how much power can be delivered to a resistor in a single pulse. For instance, an 0603 resistor can take pulses of up to about 20 watts if the duration is only a microsecond; for a millisecond pulse, the power can be no more than about 1 watt.
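As a rough illustration only, here is a minimal Python sketch that checks one pulse against that kind of curve. It uses the approximate 0603 figures quoted above (about 20 W at 1 µs, 1 W at 1 ms) and a log-log interpolation between them; the `pulse_power_limit` helper, the interpolation, and the example pulse values are all assumptions for the sketch, so the real datasheet curve remains the authority.

```python
import math

def pulse_power_limit(t_pulse, p1=(1e-6, 20.0), p2=(1e-3, 1.0)):
    """Log-log interpolation between two chart points -- an assumption;
    real derating curves are only roughly straight on log-log axes."""
    (t1, w1), (t2, w2) = p1, p2
    slope = (math.log(w2) - math.log(w1)) / (math.log(t2) - math.log(t1))
    return w1 * (t_pulse / t1) ** slope

# Example: a 12 V, 100 us rectangular pulse into a 47 ohm resistor.
V, R, t = 12.0, 47.0, 100e-6
p_applied = V ** 2 / R                 # power during the pulse
p_allowed = pulse_power_limit(t)       # approximate chart limit at this duration

print(f"applied {p_applied:.2f} W vs. allowed ~{p_allowed:.2f} W "
      f"({'OK' if p_applied < p_allowed else 'too much'})")
```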

"Or is there any way to tell whether a 0402, 0603 or 0805 standard resistor (without further specification) will suffice?"

If you want to do the job properly, read the data sheets. The data sheet will also tell you how much copper you may need around the resistor to achieve this specification, so you can't really guess with any accuracy.


While I agree with Andy Aka's answer, if you cannot get a decent data sheet, I'd estimate based on the following:

The enemy of reliability is the heat load on the resistor, or more precisely the temperature it reaches.

Based on that, you know how much energy is being transferred to the caps via

$$E=\frac{CV^2}{2}$$

Hopefully you can find a mechanical engineer for a back-of-the-envelope air-cooling estimate based on your $$I^2R$$ dissipation and energy per pulse.
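For example, a minimal sketch of that estimate in Python, with made-up values for the capacitance, voltage and switching rate; it assumes both the charge and the discharge current flow through the resistor in question:

```python
# Made-up example values -- substitute your own circuit's numbers.
C = 10e-9       # effective capacitance being charged, farads
V = 12.0        # voltage it is charged to, volts
f_sw = 20e3     # charge/discharge events per second

E_per_charge = 0.5 * C * V ** 2       # energy dissipated in the resistor per charge
P_avg = 2 * E_per_charge * f_sw       # charge + discharge through the same resistor

print(f"{E_per_charge * 1e6:.2f} uJ per edge, {P_avg * 1e3:.1f} mW average")
```

The resulting average is what you compare against the resistor's continuous power rating, derated as the datasheet requires.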

You could also do an IR (thermal) inspection on a prototype, but if you have access to IR equipment, why not source a resistor with a decent data sheet?


The resistor needs to withstand the maximum average power dissipated.

That can be a complicated calculation, but basically it means that the more often you switch the MOSFET, either continuously or in bursts, the higher the wattage of resistor you need.

If it's a simple switch to turn on a relay or lamp it won't matter.

If, however, you are pulse-width modulating some coil current at hundreds of hertz or kilohertz, the charging currents become more sustained and the power dissipated by the resistor matters.

Note that I also mentioned sustained bursts. If you periodically fire a few thousand pulses over a significant duration, you need to use those values as your worst-case scenario.
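As a hedged sketch of that worst-case check (Python, purely illustrative numbers), a half-second burst is comparable to or longer than a small chip resistor's own thermal time constant, so the resistor should be sized for the average power during the burst rather than the lower long-term average:

```python
# Illustrative numbers only.
C, V = 10e-9, 12.0            # capacitance and drive voltage
f_pwm = 100e3                 # PWM frequency inside a burst, Hz
burst_on, period = 0.5, 2.0   # 0.5 s burst every 2 s

E_cycle = C * V ** 2                       # charge + discharge energy per PWM cycle
P_burst = E_cycle * f_pwm                  # average power while the burst runs
P_long_term = P_burst * burst_on / period  # average over the whole repeat period

print(f"during burst: {P_burst * 1e3:.0f} mW, long-term: {P_long_term * 1e3:.0f} mW")
```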