Is it possible to calculate how much heat dissipation and temperature rise will take place in a resistor?

The power delivered to a resistor, all of which it converts to heat, is the voltage across it times the current through it:

    P = IV

Where P is power, I is current, and V is voltage. The current through a resistor is related to the voltage across it and the resistance:

    I = V/R

where R is the resistance. With this additional relation, you can rearrange the above equations to make power as a direct function of voltage or current:

    P = V²/R

    P = I²R

It so happens that if you stick to units of volts, amps, watts, and ohms, no additional conversion constants are required.
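If you find yourself doing these calculations repeatedly, they are trivial to wrap in code. Here is a minimal Python sketch; the function names are mine, purely for illustration:

    def power_from_voltage(v, r):
        # Watts dissipated by a resistance of r ohms with v volts across it.
        return v**2 / r

    def power_from_current(i, r):
        # Watts dissipated by a resistance of r ohms with i amps through it.
        return i**2 * r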

In your case you have 20 V across a 1 kΩ resistor:

    (20 V)²/(1 kΩ) = 400 mW

That's how much power the resistor will be dissipating.
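Or, using the sketch above:

    print(power_from_voltage(20, 1000))   # 0.4 W, i.e. 400 mW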

The first step in dealing with this is to make sure the resistor is rated for that much power in the first place. Obviously, a "¼ Watt" resistor won't do. The next common size up is "½ Watt", which can take that power in theory, with all appropriate conditions met. Read the datasheet carefully to see under what conditions your ½ Watt resistor can actually dissipate ½ Watt. It might specify that the ambient temperature has to be 20 °C or less, with a certain amount of ventilation. If this resistor is on a board that is in a box with something else that dissipates power, like a power supply, the ambient temperature could be significantly more than 20 °C. In that case, the "½ Watt" resistor can't really handle ½ Watt, unless perhaps air from a fan is actively blowing across its top.
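To make that concrete, many datasheets specify a linear derating curve: full rated power up to some ambient temperature, falling linearly to zero at the maximum operating temperature. Here is a rough sketch of that kind of check; the 70 °C knee and 155 °C endpoint are made-up example numbers, not from any particular datasheet:

    def derated_power(p_rated, t_ambient, t_knee=70.0, t_zero=155.0):
        # Allowed dissipation in watts at a given ambient temperature in °C,
        # assuming full rated power up to t_knee and linear derating to zero
        # at t_zero. Both temperature values are examples only.
        if t_ambient <= t_knee:
            return p_rated
        if t_ambient >= t_zero:
            return 0.0
        return p_rated * (t_zero - t_ambient) / (t_zero - t_knee)

    print(derated_power(0.5, 25))    # 0.5 W: full rating available
    print(derated_power(0.5, 100))   # about 0.32 W: no longer enough for 400 mW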

To know how much the resistor's temperature will rise above ambient, you will need one more figure: the thermal resistance from the resistor to ambient. This will be roughly the same for a given package type, but the true answer is available only from the resistor's datasheet.

Let's say, just to pick a number (out of thin air, I didn't look anything up, example only), that the resistor with suitable copper pads has a thermal resistance of 200 °C/W. The resistor is dissipating 400 mW, so its temperature rise will be about (400 mW)(200 °C/W) = 80 °C. If it's on an open board on your desk, you can probably figure 25 °C maximum ambient, so the resistor could get to 105 °C. Note that's hot enough to boil water, but most resistors will be fine at this temperature. Just keep your finger away. If this is on a board in a box with a power supply that raises the temperature in the box 30 °C above ambient, then the resistor temperature could reach (25 °C) + (30 °C) + (80 °C) = 135 °C. Is that OK? Don't ask me, check the datasheet.
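The same arithmetic in code, again using the made-up 200 °C/W figure:

    def resistor_temp(p_watts, theta_ca, t_ambient):
        # Resistor temperature in °C from dissipation in watts, thermal
        # resistance to ambient in °C/W, and ambient temperature in °C.
        return t_ambient + p_watts * theta_ca

    print(resistor_temp(0.4, 200, 25))        # 105 °C, open board on a desk
    print(resistor_temp(0.4, 200, 25 + 30))   # 135 °C, in a warm box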


Dissipation just comes from the power law.

The temperature rise is impossible to predict without knowing how well the given resistor sheds heat. That depends on what it is in contact with (heat sink or not?), what the airflow is, and what the ambient temperature is. The less effectively the resistor can shed heat, the higher its temperature will have to rise for it to dissipate the wattage implied by the power law. We cannot predict this simply from voltage and resistance.

Furthermore, resistors have a temperature-dependent resistance. If both the temperature rise and the temperature coefficient are significant, the effect may need to be taken into account.
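As a rough illustration of how much that can matter (the 100 ppm/°C figure below is a typical thick-film spec, picked as an example, not taken from any particular part):

    def resistance_at_temp(r_nominal, tempco_ppm, delta_t):
        # First-order model: resistance after a temperature rise of delta_t °C,
        # given a temperature coefficient in ppm/°C.
        return r_nominal * (1 + tempco_ppm * 1e-6 * delta_t)

    # 1 kΩ with 100 ppm/°C and an 80 °C rise shifts to 1008 Ω, a 0.8 % change.
    print(resistance_at_temp(1000, 100, 80))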