Chemistry - Can someone intuitively explain the reason for the units of entropy (J/K)?

Solution 1:

The units of energy over temperature (e.g. J/K) used for entropy in the thermodynamic definition follow from a historical association with heat transfer under temperature gradients. In other words, the definitions of temperature and entropy are intertwined, with entropy being the more fundamental property. Entropy can be thought of as a potential, and temperature (or rather its inverse) as a generalized force associated with displacements along energy dimensions in the entropy potential. The inverse temperature measures the effect of a change in the amount of energy on the entropy of the system, as the following definition suggests:

$$\frac{1}{T}=\left(\frac{\partial S}{\partial U}\right)_V$$
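As a concrete check of this definition (a minimal toy sketch of my own, not part of the original answer: an $N$-spin two-level system with level spacing $\varepsilon$, using the statistical entropy $S=k_\mathrm{B}\log\Omega$ introduced below), one can estimate $1/T$ by a finite difference of $S$ with respect to $U$:

```python
# Toy sketch (hypothetical model, not from the answer): N two-level spins
# with level spacing eps, so U = n*eps for n excited spins and the
# multiplicity is Omega(n) = C(N, n). We estimate 1/T = (dS/dU)_V by a
# central finite difference and compare with the Stirling-approximation result.
from math import comb, log

kB = 1.380649e-23   # Boltzmann constant, J/K
N = 1000            # number of two-level spins (arbitrary toy size)
eps = 1.0e-21       # level spacing in J (arbitrary choice)

def S(n):
    """Entropy S = kB * ln(Omega) for n excited spins, in J/K."""
    return kB * log(comb(N, n))

n = 250  # pick a macrostate with U = n*eps
# central difference: dS/dU ~ [S(n+1) - S(n-1)] / (2*eps)
inv_T_numeric = (S(n + 1) - S(n - 1)) / (2 * eps)
# Stirling's approximation gives dS/dU = (kB/eps) * ln((N - n)/n)
inv_T_analytic = (kB / eps) * log((N - n) / n)

print(f"1/T (finite difference): {inv_T_numeric:.6e} K^-1")
print(f"1/T (Stirling approx.):  {inv_T_analytic:.6e} K^-1")
```

The two estimates agree closely, and the units work out as advertised: $S$ carries J/K, $U$ carries J, so $\partial S/\partial U$ carries 1/K.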

However, if you start from a statistical mechanical definition of the entropy,

$$S=k_\mathrm{B}\log\Omega$$

then it is easy to see that a unitless definition might be just as well suited:

$$S_{\mathrm{unitless}}=\log\Omega$$

However, in the absence of the Boltzmann constant, you need some other way to relate entropy and energy (e.g. heat transfer) in our conventional system of units. You can of course subsume $k_\mathrm{B}$ into the definition of a new temperature scale:

$$\frac{1}{T_\mathrm{new}}=\left(\frac{\partial S_\mathrm{unitless}}{\partial U}\right)_V$$

where the old and new temperature scales are related as $T_\mathrm{new}=k_\mathrm{B}T$. The new temperature scale then has units of energy.
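To put a number on this rescaling (a minimal sketch of my own; the constants are standard, but the script itself is not part of the original answer):

```python
# Minimal sketch: expressing temperature in energy units via T_new = kB * T.
kB = 1.380649e-23      # Boltzmann constant, J/K
eV = 1.602176634e-19   # one electronvolt, in J

T = 298.15             # room temperature, K
T_new = kB * T         # temperature rescaled into energy units (J)

print(f"T_new = {T_new:.3e} J = {T_new / eV * 1000:.1f} meV")
# -> roughly 4.116e-21 J, i.e. about 25.7 meV
```

This is the familiar $k_\mathrm{B}T \approx 25\ \mathrm{meV}$ scale of thermal energies at room temperature.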

Solution 2:

Entropy isn't "just" a measure of randomness. It is the only physical property that gives the universe a temporal direction, i.e., that provides an "arrow of time".

Also, a "measure of randomness" a crude way to characterize entropy. Rather, the entropy of a system is proportional (through Boltzmann's constant) to the log of the number of possible macroscopically indistinguishable microstates that a system in a given state can sample, weighted by the relative probabilities of those microstates.

And what determines ($k_\mathrm{B}$ times the log of) how many microstates a system at a given temperature can sample? It is how much heat we have let flow (reversibly) into the system, divided by the temperature at each point along the way.

And that is why the units of entropy are energy/temperature: it is the integral of energy flow over temperature that determines the number of available states, and it is the number of available states that in turn determines the entropy.

More precisely, the entropy of a system is the integral of the inverse temperature times the amount of reversible heat flow needed to bring that system from absolute zero to its current temperature. I.e.:

$$S(T') = \int_{0}^{T'} dS = \int_{0}^{T'} \frac{\text{đ}q_\mathrm{rev}}{T}$$
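As a minimal numerical sketch of this integral (my own illustration; the Debye-like low-temperature heat capacity $C(T)=aT^3$ and all numbers below are assumptions, chosen so that $\text{đ}q_\mathrm{rev}=C(T)\,dT$ and the integrand stays finite at $T=0$, with exact answer $S(T')=aT'^3/3$):

```python
# Assumed model, not from the answer: evaluate S(T') = int_0^T' dq_rev / T
# with dq_rev = C(T) dT and a Debye-like heat capacity C(T) = a*T**3,
# for which the exact result is S(T') = a*T'**3 / 3.
a = 1.0e-3        # heat-capacity coefficient, J/K^4 (arbitrary toy value)
T_prime = 50.0    # target temperature, K
n_steps = 100_000

# trapezoidal rule for S = int_0^T' C(T)/T dT, where C(T)/T = a*T**2
dT = T_prime / n_steps
S_numeric = 0.0
for i in range(n_steps):
    T0, T1 = i * dT, (i + 1) * dT
    S_numeric += 0.5 * (a * T0**2 + a * T1**2) * dT

S_exact = a * T_prime**3 / 3
print(f"numerical S(T') = {S_numeric:.6f} J/K")
print(f"analytic  S(T') = {S_exact:.6f} J/K")
```

The numerical and analytic values agree, and the result indeed comes out in J/K: an energy (the accumulated reversible heat) divided by the temperature at which each increment flowed.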

Tags:

Entropy

Units