# For an isolated system, can the entropy decrease or increase?

Isolated system: Since the matter, energy, and momentum are fixed, the total number of microstates that satisfy these constraints is fixed. So is the entropy constant? Yes, if the system is in equilibrium; no, if it is not. What does that mean in terms of microstates?

If the system is in equilibrium, all these microstates are equally probable, and the system visits each of them over the course of time (the ergodic hypothesis). Each microstate therefore has equal probability, and $S=k_b \ln\Omega$. In such a state no further increase in entropy is possible.

If the system is not in equilibrium, it does not have equal probability of being in every microstate. In fact, if the system is stuck out of equilibrium (e.g., a hot part and a cold part of a box separated by a thermally insulating wall), it cannot access some microstates at all, so it is restricted to fewer microstates. Strictly speaking, there is then no unique global thermodynamic state for the whole system, so you cannot define a single entropy; but you can compute one by summing the entropies of the locally equilibrated parts (e.g., the entropies of the hot part and the cold part separately).

Since the number of accessible microstates is smaller, this entropy is less than what it would be if you removed the insulating wall and let the system equilibrate. As the system equilibrates, more microstates become accessible (think of the system spreading out in phase space), so entropy increases. Once complete thermodynamic equilibrium is reached, no further increase is possible: all allowed microstates have become accessible and equally probable!
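The effect of removing a constraint can be illustrated with a toy model. Below is a minimal sketch (the Einstein-solid model and the particular oscillator and quanta counts are illustrative assumptions, not from the text): $\Omega(N,q)=\binom{q+N-1}{N-1}$ counts the ways to distribute $q$ energy quanta among $N$ oscillators.

```python
from math import comb, log

def omega(N, q):
    """Microstates of an Einstein solid: q energy quanta among N oscillators."""
    return comb(q + N - 1, N - 1)

N1, N2 = 50, 50      # oscillators in the hot and cold parts (illustrative)
q1, q2 = 80, 20      # energy quanta: the hot part holds more energy

# With the insulating wall, q1 and q2 are separately fixed:
omega_wall = omega(N1, q1) * omega(N2, q2)

# Removing the wall fixes only the total q = q1 + q2,
# so every split of the energy becomes accessible:
q = q1 + q2
omega_free = sum(omega(N1, k) * omega(N2, q - k) for k in range(q + 1))

# Entropy in units of k_b: S/k_b = ln(Omega)
print(log(omega_wall) < log(omega_free))  # True: more accessible microstates
```

With the wall, only the $q_1=80$, $q_2=20$ split is accessible; without it, every split contributes, so $\ln\Omega$ (and hence $S$) is strictly larger.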

Since entropy is

$$S=k_b \ln \Omega,$$

it corresponds to a certain phase-space volume $\Omega$, which we have to fix somehow. The phase-space volume counts the possible states of the system (in the simplest case, the positions and momenta of all particles). It is normally fixed by setting some macroscopic parameters, such as temperature, pressure, or the number of particles.
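A quick sketch of why the logarithm appears in $S=k_b \ln \Omega$ (the microstate counts below are made up for illustration): counts of independent subsystems multiply, so taking the logarithm makes entropy additive.

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def S(omega):
    """Boltzmann entropy S = k_B ln(Omega)."""
    return k_B * log(omega)

# Microstate counts of independent subsystems multiply,
# so the logarithm makes entropy additive:
omega1, omega2 = 10**12, 10**15   # illustrative counts, not physical values
assert abs(S(omega1 * omega2) - (S(omega1) + S(omega2))) < 1e-30
```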

So, if you have an isolated system (i.e. a box where neither matter nor energy can get in or out), you can e.g. fix the volume, the energy and the number of particles. You end up with $S(E,V,N)=k_b \ln \Omega(E,V,N)$.

This can be understood as the number of all possible configurations inside the box that have total energy $E$, occupy the volume $V$ and consist of $N$ particles: configurations with all the stuff on the left side, on the right side, evenly distributed, half of it moving while the other half stands still, and so on. $S$ is a *constant* number because $E$, $V$ and $N$ are fixed.

The problem is that this tells you nothing about what is going on inside the system. But instead of $S(E,V,N)$, you can look at other dependencies of $S$. Let's say you divide the box into two equally sized parts, a *left* part and a *right* part, with $N_1$ particles in the left part and $N_2$ in the right part (so $N_1+N_2=N$).

Now you can calculate $S(E,V,N_1,N_2)$ and *choose* $N_1$ and $N_2$. That way, $S$ can *change*, because particles may go from the left to the right part and vice versa. The important point is that $S$ is *much* greater for $N_1\approx N_2$ than for any other values.
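A hypothetical sketch of this peak, counting only the ways to assign particles to the two halves (ignoring energy and volume for simplicity): the number of ways to put $N_1$ of $N$ particles on the left is $\binom{N}{N_1}$, and its logarithm is sharply peaked at $N_1=N/2$.

```python
from math import comb, log

N = 100  # total number of particles in the box (illustrative)

def ln_omega(N1):
    """S/k_b = ln(Omega) for N1 of the N particles in the left half."""
    return log(comb(N, N1))

# The entropy is maximal at the even split:
peak = max(range(N + 1), key=ln_omega)
print(peak)  # 50

# ... and the even split dwarfs lopsided ones:
print(ln_omega(50) > ln_omega(10) + 30)  # True
```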

This means that most of the possible configurations have approximately the same number of particles on each side. Because every configuration is equally probable, the system will look like this most of the time. But not all the time: even when $S(E,V,N_1,N_2)=0$, i.e. when exactly one microscopic configuration fulfils the macroscopic parameters, that configuration will occur at some point (if the system is ergodic, but that is a different story). In this example, the entropy increases and decreases all the time as particles cross from the left half to the right half.

But by definition, the fraction of time the system spends in a specific configuration $N_1$, $N_2$ is proportional to $\Omega(E,V,N_1,N_2)=\exp\left( \frac {S(E,V,N_1,N_2)} {k_b}\right)$.
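Under that assumption, the fraction of time spent near the even split can be computed directly in a toy model where each particle sits independently in the left or right half (the numbers are illustrative): the system spends the fraction $\binom{N}{N_1}/2^N$ of its time with $N_1$ particles on the left.

```python
from math import comb

N = 100          # total number of particles (illustrative)
total = 2**N     # every assignment of particles to halves is one microstate

def p(N1):
    """Fraction of time (assuming ergodicity) with N1 particles on the left."""
    return comb(N, N1) / total

# Nearly all the time is spent close to the even split:
near_even = sum(p(k) for k in range(40, 61))
print(near_even > 0.95)  # True
```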

*In conclusion: The term entropy is meaningless without the parameters one wants to fix/vary.*

Funny example: Let's calculate $S(N_{\textrm{people alive on earth}})$ for the whole universe, which describes the number of possible configurations of the universe for a given number of living people on earth. $S(0)$ will probably be much bigger than for any other value, which means that, most of the time, there will be no people on earth and the entropy will be huge (this is my interpretation of the heat death of the universe). Then they might come back, and the entropy will be very small.