Why would a black hole explode?

The power emitted as Hawking radiation is $$ P = \frac{\hbar c^6}{15360 \pi G^2 M^2} = 3.6\times10^{32}\, M^{-2}\ \text{W} = -c^2 \frac{dM}{dt},$$ where $M$ is the black hole mass in kilograms and the term on the far right-hand side expresses the rate at which that mass decreases due to the emission of Hawking radiation.

Notice that the power emitted actually increases as $M$ decreases, and with it the rate at which the mass is lost.

So as the black hole gets less massive, the rate at which it loses mass increases, and hence the power it emits grows very, very rapidly.
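To make the runaway concrete, here is a minimal Python sketch that evaluates the power formula above for a few masses; the values of $\hbar$, $c$ and $G$ are standard constants assumed here, not taken from the answer.

```python
import math

# Assumed CODATA-style constants (not quoted in the answer itself)
hbar = 1.0546e-34   # J s
c    = 2.998e8      # m / s
G    = 6.674e-11    # m^3 kg^-1 s^-2

def hawking_power(M):
    """Power radiated (W) by a black hole of mass M (kg): P = hbar c^6 / (15360 pi G^2 M^2)."""
    return hbar * c**6 / (15360 * math.pi * G**2 * M**2)

# The prefactor reproduces the 3.6e32 quoted above...
print(f"prefactor = {hawking_power(1.0):.2e} W kg^2")

# ...and the power climbs steeply as the mass shrinks (P ~ 1/M^2)
for M in (1e11, 1e8, 1e5):
    print(f"M = {M:.0e} kg  ->  P = {hawking_power(M):.2e} W")
```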

By solving this differential equation it can be shown that the time to evaporate to nothing is given by $$ t = 8.4\times10^{-17}\, M^3\ \text{s}$$ (again with $M$ in kg), so for example a 100 tonne ($10^5$ kg) black hole would evaporate in $8.4 \times10^{-2}\ \text{s}$, emitting approximately $E = Mc^2 = 9\times 10^{21}$ joules of energy as it does so – equivalent to more than a million megatons of TNT. I guess you could call this an explosion!
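For completeness, a sketch of that integration (separating variables and integrating from the initial mass $M$ down to zero): $$ -c^2\frac{dM}{dt} = \frac{\hbar c^6}{15360\pi G^2 M^2} \;\Rightarrow\; \int_M^0 M'^2\,dM' = -\frac{\hbar c^4}{15360\pi G^2}\int_0^t dt' \;\Rightarrow\; t = \frac{5120\pi G^2 M^3}{\hbar c^4} \simeq 8.4\times10^{-17}\, M^3\ \text{s}.$$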

This will be the fate of all evaporating black holes, but most will take a very long time to get to this stage (even supposing they do not accrete any matter). The evaporation time is less than the age of the universe only for $M$ less than about $2\times10^{11}\ \text{kg}$. A 1 solar mass black hole takes $2\times10^{67}$ years to evaporate.
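As a rough check on those two numbers, here is a short Python sketch using the evaporation-time relation above; the age of the universe and the solar mass are assumed values, not taken from the answer.

```python
COEFF = 8.4e-17              # s kg^-3, from t = 8.4e-17 M^3 s above
T_UNIVERSE_S = 4.35e17       # ~13.8 Gyr in seconds (assumed)
SECONDS_PER_YEAR = 3.156e7
M_SUN = 1.989e30             # kg (assumed)

# Largest mass that can fully evaporate within the age of the universe
m_max = (T_UNIVERSE_S / COEFF) ** (1 / 3)
print(f"Evaporates within the age of the universe only if M < {m_max:.1e} kg")  # ~1.7e11 kg

# Evaporation time of a 1 solar mass black hole
t_sun_years = COEFF * M_SUN**3 / SECONDS_PER_YEAR
print(f"1 solar mass black hole evaporates in ~{t_sun_years:.0e} years")        # ~2e67 years
```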

EDIT: The Hawking radiation temperature is given by $$ kT = \frac{\hbar c^3}{8 \pi GM}.$$ Unless this temperature exceeds the ambient temperature (at a minimum, that of the cosmic microwave background), the black hole will absorb more energy than it radiates and will grow. That is, for the black hole to evaporate, $$ \frac{\hbar c^3}{8 \pi GM} > kT_{\rm ambient},$$ which gives $$ M < \frac{1.2\times10^{23}}{T_{\rm ambient}}\ {\rm kg}.$$
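Plugging in the present-day microwave background temperature shows how weak this constraint now is; the constant values in this sketch are assumed, not taken from the answer.

```python
import math

hbar  = 1.0546e-34   # J s
c     = 2.998e8      # m / s
G     = 6.674e-11    # m^3 kg^-1 s^-2
k     = 1.381e-23    # J / K  (Boltzmann constant)
T_CMB = 2.725        # K  (present-day CMB temperature, assumed)

coeff = hbar * c**3 / (8 * math.pi * G * k)
print(f"To evaporate: M < {coeff:.1e} / T_ambient kg")    # ~1.2e23 / T
print(f"For T = {T_CMB} K: M < {coeff / T_CMB:.1e} kg")   # ~4.5e22 kg, far above the 1e11 kg threshold
```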

Therefore, unless I've made a mistake, this proviso is of no practical importance today; it matters only for evaporating black holes (i.e. those with $M<10^{11}$ kg) in the hot early universe.

The temperature of a black hole scales with its evaporation timescale as $t_{\rm evap}^{-1/3}$, while the temperature of the early, radiation-dominated universe scales as $t^{-1/2}$. Thus it appears that, far enough back in the past, even a black hole whose evaporation timescale was shorter than the age of the universe at that epoch would have been colder than its surroundings and hence unable to evaporate.
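In symbols, a sketch of that scaling argument using the relations quoted above: $$ T_{\rm BH} \propto M^{-1} \propto t_{\rm evap}^{-1/3}, \qquad T_{\rm universe} \propto t^{-1/2},$$ so for a hole whose evaporation timescale equals the age of the universe at some epoch, $T_{\rm BH}/T_{\rm universe} \propto t^{1/6}$, which shrinks towards earlier times.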


Unlike most objects, a black hole's temperature increases as it radiates away mass. The rate of temperature increase is exponential, with the most likely endpoint being the dissolution of the black hole in a violent burst of gamma rays. A complete description of this dissolution requires a model of quantum gravity, however, as it occurs when the black hole approaches Planck mass and Planck radius.

Wikipedia

All black holes are theorized to emit Hawking radiation at a rate inversely proportional to their mass. Since this emission further decreases their mass, black holes with very small mass would experience runaway evaporation, creating a massive burst of radiation at the final phase, equivalent to a hydrogen bomb yielding millions of megatons of explosive force.

Wikipedia