The calculation of the entropy of a single atom

The entropy of a single atom does not make sense per se unless you specify the preparation. The entropy of a single isolated atom, fixed at a point, is indeed not defined – the entropy is, after all, a property of an ensemble, not of a system. The entropy of an ensemble of isolated atoms prepared at a specific energy, on the other hand, is well defined (this would be a microcanonically prepared ensemble of atoms).

If you allow the atoms in the ensemble to fly around in a box (to clarify: the ensemble consists of many boxes, each containing one atom), then the dominating entropy at low temperatures will simply be that of an ideal gas (as the discrete excitation energies of the atom will be large compared to the temperature, but the translational degrees of freedom are easily accessible).
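
For orientation, and as a sketch valid in the classical regime (box large compared to the thermal de Broglie wavelength $\lambda_T$), the translational entropy of one structureless atom in a box of volume $V$ is \begin{equation} S_{\text{trans}} = k_B\left[\ln\frac{V}{\lambda_T^{3}} + \frac{3}{2}\right], \qquad \lambda_T = \frac{h}{\sqrt{2\pi m k_B T}}, \end{equation} which is the "ideal gas" entropy referred to above, evaluated for the one-atom-per-box ensemble.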

In the following, this answer will only discuss the case where the atom is fixed in space (or where we do not care about the "boring" translational degrees of freedom).

While for large systems the thermodynamic observables agree in the different ensembles (microcanonical, canonical, grand canonical), this is not true for small systems, so the entropy you assign to the atom depends on the ensemble.

Grand canonical preparation does not make sense for a single atom (there is no particle number degree of freedom). So we are left with the microcanonical and the canonical ensemble.

The microcanonical ensemble was mentioned above, but it does not describe an interesting situation with single atoms – there are no highly degenerate states in an atom (unless you allow energies above the binding energy of the atom – but then the system hardly counts as an atom). With such a sparse, discrete spectrum you cannot even define a temperature (as this requires that $\partial_E S$ is a well-defined quantity).

The canonical ensemble on the other hand does correspond to a physical situation – an ensemble of single atoms coupled to a thermal bath. This can occur, for example, as an idealization of an atom in a dilute gas (if we are only interested in the internal degrees of freedom). Now we can easily calculate all thermodynamic observables, given the Hamiltonian $H$ of the atom ($\beta := 1/T$ is the inverse temperature, $k_B = 1$): \begin{align*} Z &= \mathrm{Tr}\ e^{-\beta H} \\ F &= -T \ln Z \\ S &= -\partial_T F \\ E &= F + TS \end{align*}
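
As a small numerical illustration (not part of the original argument), here is a minimal Python sketch of these four formulas for a fixed atom with a given level structure; the level energies, degeneracies and temperature below are made-up placeholders, and $k_B = 1$ as above:

```python
import numpy as np

def canonical_observables(energies, degeneracies, T):
    """Z, F, S, E in the canonical ensemble for a fixed atom (k_B = 1)."""
    beta = 1.0 / T
    E_n = np.asarray(energies, dtype=float)
    g_n = np.asarray(degeneracies, dtype=float)
    w = g_n * np.exp(-beta * E_n)   # Boltzmann weights of the levels
    Z = w.sum()                     # Z = Tr e^{-beta H}
    F = -T * np.log(Z)              # F = -T ln Z
    E = (w * E_n).sum() / Z         # E = <H>
    S = (E - F) / T                 # S = -dF/dT, equivalently (E - F)/T
    return Z, F, S, E

# Hypothetical example: a two-level "atom" with a threefold-degenerate excited
# state at excitation energy 1 (arbitrary units), evaluated at T = 0.5.
print(canonical_observables([0.0, 1.0], [1, 3], T=0.5))
```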

Remarks

  1. Note that the quantities only depend on $T$: there is no system volume (we consider a single atom fixed in space) or particle number, so we cannot define the chemical potential or the pressure. To enrich the theory, one could go on to discuss the dependence on something like an external magnetic field (which is conjugate to the magnetization).
  2. For non-interacting particles these formulas can be written in terms of the energies of the single-particle eigenstates. One can then easily derive an explicit formula for the entropy; a sketch is spelled out just after this list.
  3. It is a question of language whether we say that "the entropy of a single atom" is defined. Technically, it is the entropy of an ensemble of single atoms. The entropy is, after all, a property of an ensemble. But we usually operate under the assumption that time averages and ensemble averages are the same – in this sense we can hand-wavingly consider the atom at different times as our ensemble of atoms and speak of the entropy of the atom.
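
Sketch of the explicit formula mentioned in remark 2: writing $E_n$ for the eigenvalues of $H$ and $p_n = e^{-\beta E_n}/Z$ for the canonical occupation probabilities, the formulas above reduce to \begin{align*} Z &= \sum_n e^{-\beta E_n}, \\ S &= -\sum_n p_n \ln p_n = \ln Z + \beta \sum_n p_n E_n . \end{align*}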

Before talking about entropy, we need to discuss what possible states an atom can be in. I will start with the most general case, which consists in considering a single-atom gas in a 3D box. In that case, the microstate of the atom is described by:

  • The definite linear momentum states $| \textbf{k} \rangle$ of the atom (which are eigenvectors of the Hamiltonian)

  • Its internal state, characterised by a list of quantum numbers like $|n, \ell, m_{\ell}, s, m_s \rangle$

If there is not enough energy in the system to excite the atom from its ground state, then only its momentum state matters for computing its statistical properties. If, furthermore, the de Broglie wavelength of the atom is small compared to the box size (or the confinement size for some cooling techniques), then the atom's state can be accounted for by the position of the center of mass of the atom in phase space $(\textbf{r}, \textbf{p})$.
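
To make this criterion quantitative: for a thermal ensemble the relevant length scale is the thermal de Broglie wavelength $\lambda_{dB} = h/\sqrt{2\pi m k_B T}$, and the classical phase-space picture holds as long as $\lambda_{dB}$ is much smaller than the box (or confinement) size. A small sketch, with made-up mass, temperature and confinement size purely for illustration:

```python
from math import pi, sqrt
from scipy.constants import h, k as k_B

def thermal_de_broglie(m, T):
    """Thermal de Broglie wavelength (m) for an atom of mass m (kg) at temperature T (K)."""
    return h / sqrt(2 * pi * m * k_B * T)

# Hypothetical numbers: an atom of mass ~1.4e-25 kg (roughly a heavy alkali atom)
# at 1 mK, compared with a 100 micrometre confinement size.
L = 100e-6
lam = thermal_de_broglie(1.4e-25, 1e-3)
print(lam, lam < L)   # the classical description is justified while lam << L
```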

When performing laser cooling of a gas, a big part of the process, initially at least, has to do with a drop in the velocity dispersion – by definition of cooling. That is because gas cooling (in the context of cold atoms) is first performed by reducing the (classical) momentum dispersion, applying a force in the direction opposite to that of the gas jet / atomic beam. That is the principle of many laser cooling or fountain-like devices, at least.

Overall, if the gas density is considered unchanged during the cooling and the temperature changes from $T_h$ to $T_c$, then one can expect an entropy change per atom that is well approximated by:

\begin{equation} \Delta S = -\frac{k_B}{2}\ln \frac{T_h}{T_c} \end{equation} upon being cooled down from $T_h$ to $T_c$.

Note that I have put $k_B/2$ instead of $3k_B/2$. That is because, in most of the preparations I know of, molecular effusion is used to generate the initial beam of atoms. Such a beam generally has a low statistical dispersion in the directions perpendicular to the jet axis (and thus somewhat lower temperatures for those degrees of freedom), and moreover these degrees of freedom are untouched by either standard laser cooling or gravitational fountains.
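
As a rough numerical illustration (the temperatures below are placeholders, not taken from any particular experiment), the per-atom entropy change can be evaluated directly and compared with the three-axis version:

```python
from math import log
from scipy.constants import Boltzmann as k_B   # in J/K

def delta_S_per_atom(T_h, T_c, dof=1):
    """Entropy change per atom (J/K) at fixed density when `dof` translational
    degrees of freedom are cooled from T_h down to T_c (classical ideal gas)."""
    return -(dof / 2) * k_B * log(T_h / T_c)

# Hypothetical numbers: longitudinal cooling from ~300 K to ~1 mK.
print(delta_S_per_atom(300.0, 1e-3))          # one axis, as in the formula above
print(delta_S_per_atom(300.0, 1e-3, dof=3))   # all three axes, for comparison
```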

Of course, once the gas is trapped by combining, say, lasers and magnetic fields, one would have to bring back this factor of 3, and possibly also take into account quantum effects: the discreteness of the momentum spectrum and fermionic or bosonic quantum statistics at sufficiently low temperatures.

Remark 1: While the beam is being cooled in the first stage, one could also note that there is a slight increase of entropy due to the dispersion of the gas in the directions perpendicular to the beam's axis. I have not accounted for it in the formula given above.