# Why isn't temperature measured in Joules?

One reason you might think $T$ should be measured in Joules is the idea that temperature is the average energy per degree of freedom in a system. However, this is only an approximation. That definition would correspond to something proportional to $\frac{U}{S}$ (internal energy over entropy) rather than $\frac{\partial U}{\partial S}$, which is the real definition. The approximation holds in cases where the number of degrees of freedom doesn't depend much on the amount of energy in the system, but for quantum systems, particularly at low temperatures, there can be quite a bit of dependence.
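To make the distinction concrete, here is a small numerical sketch (not part of the argument above, just an illustration) using the Sackur-Tetrode entropy of a monatomic ideal gas. Computing $\frac{\partial U}{\partial S}$ numerically recovers the temperature, while $\frac{U}{S}$ gives a completely different number. The particle mass, particle number, and volume are arbitrary illustrative choices.

```python
import math

# Monatomic ideal gas (Sackur-Tetrode entropy): T = dU/dS, not U/S.
# All values SI; helium-like parameters chosen arbitrarily for illustration.
k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s
m = 6.64e-27         # particle mass, kg (roughly a helium atom)
N = 1e22             # number of particles
V = 1e-3             # volume, m^3

def entropy(U):
    """Sackur-Tetrode entropy S(U, V, N) of a monatomic ideal gas."""
    return N * k_B * (math.log(V / N * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

U = 1.5 * N * k_B * 300.0                 # energy corresponding to T = 300 K
dU = U * 1e-6                             # small energy step
T = dU / (entropy(U + dU) - entropy(U))   # numerical dU/dS
print(T)               # ≈ 300 K
print(U / entropy(U))  # ≈ 28 K: a different number entirely
```

The two quantities agree only when $S$ grows linearly with $U$, which is exactly the "degrees of freedom don't depend on energy" approximation described above.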

If you accept that $T$ is defined as $\frac{\partial U}{\partial S}$ then the question is about whether we should treat entropy as a dimensionless quantity. This is certainly possible, as you say.

But for me there's a very good practical reason not to do that: temperature is not an energy, in the sense that it doesn't, in general, make sense to add the temperature to the internal energy of a system or set them equal. Units are a useful tool for preventing you from accidentally trying to do such a thing.

In special relativity, for example, it makes sense to set $c=1$ because then it *does* make sense to set a distance equal to a time. By doing that, you're simply saying that the path between two points is light-like.

But $T=\frac{\partial U}{\partial S}$ measures the change in energy with respect to entropy. Entropy and energy are extensive quantities, whereas temperature is an intensive one. This means that it doesn't very often make sense to equate them without also including some non-constant factor relating to the system's size. For this reason, it's very useful to keep Boltzmann's constant around.

My personal favorite way to do it is to measure entropy in bits, so that $k_B = \frac{1}{\ln 2} \,\mathrm{bits}$ and the units of temperature are $\mathrm{J\cdot bits^{-1}}$. Having entropy rather than temperature as the quantity with the fundamental unit tends to make it much clearer what's going on, and bits are a pretty convenient unit in terms of building an intuition about the relationship to probability theory.
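As a quick numerical sketch of this convention: since the product $T\,\mathrm{d}S$ must still come out in joules, measuring $S$ in bits rescales temperature by a factor of $k_B \ln 2$.

```python
import math

# Sketch: converting temperature from kelvin to J/bit, under the
# convention that entropy is measured in bits (S_bits = S / (k_B ln 2)).
k_B = 1.380649e-23  # Boltzmann constant, J/K

def kelvin_to_joules_per_bit(T_kelvin):
    # T dS is unchanged, so T_[J/bit] = k_B * ln(2) * T_[K]
    return k_B * math.log(2) * T_kelvin

print(kelvin_to_joules_per_bit(300.0))  # ≈ 2.87e-21 J/bit
```

Conveniently, that value at room temperature, $k_B T \ln 2 \approx 2.9\times 10^{-21}\,\mathrm{J}$, is the same quantity that appears as Landauer's bound on the energy cost of erasing one bit.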

Temperature cannot be measured in units reserved for energy because, for instance, a grain of sand heated to the same temperature as the Sun does not contain the same amount of energy as the Sun.

Temperature is the property that, when two bodies in thermal contact have the same value of it, no net heat flows from one body to the other: they are in thermal equilibrium.

If you try to equate this property with energy, and so assign it units of joules, the result isn't physically correct.

Two bodies that exchange no heat because they are in thermal equilibrium are not necessarily isoenergetic: one could contain far more energy than the other.

But in fact, temperature is related to energy. In a (very simplified, idealized) particle model of a body of matter, temperature indicates the average kinetic energy of a particle of that body. That also directly tells you that it cannot have energy units: a phenomenon which corresponds to a density measure of energy (energy **per unit** of mass or volume, or per particle) cannot be measured in energy units.
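In that idealized kinetic model the relation is $\langle E_{\mathrm{kin}}\rangle = \frac{3}{2} k_B T$ per particle, which a one-line calculation makes concrete (a sketch, assuming the simplified monatomic picture above):

```python
# Sketch: in the idealized kinetic model, the mean translational kinetic
# energy per particle is (3/2) k_B T -- an energy *per particle*, not a
# total energy for the body.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_kinetic_energy(T_kelvin):
    return 1.5 * k_B * T_kelvin

print(mean_kinetic_energy(300.0))  # ≈ 6.2e-21 J per particle
```

The grain of sand and the Sun at the same temperature share this per-particle value, while their total energies differ enormously.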

I've seen temperature expressed in electron volts (eV) in plasma physics. Basically, you can equate $k_B T = e y$, where $y$ is the temperature in electron volts and $T$ is the temperature in kelvin; here $e$ is the elementary charge and $k_B$ is the Boltzmann constant. Setting $y = 1$ gives $1\mbox{ eV temperature} \approx 11600 \mbox{ K}$. I guess it is convenient in plasma physics because of the energy scales involved.
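A quick numerical check of that conversion (a sketch; the constant values are the CODATA ones):

```python
# Sketch: converting between "temperature in eV" (plasma-physics usage)
# and temperature in kelvin, via k_B T = e * y.
k_B = 1.380649e-23     # Boltzmann constant, J/K
e = 1.602176634e-19    # elementary charge, C

def kelvin_to_ev(T_kelvin):
    return k_B * T_kelvin / e

def ev_to_kelvin(y_ev):
    return e * y_ev / k_B

print(ev_to_kelvin(1.0))  # ≈ 11604.5 K
```

So a "1 eV" plasma is at about 11600 K, matching the figure quoted above.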