Entropy in a system will always increase, as per the Second Law of Thermodynamics. Increasing entropy over time in a system is the natural result of that system existing.
Entropy of a closed system can also be constant. Example: an expanding universe filled with photons is isentropic, i.e. has constant entropy. Why? Because expanding a sea of photons is a reversible process -- no useful energy is lost, and the system could evolve the other way just as easily. To be precise, the entropy of a sea of photons is proportional to its volume times its temperature cubed, S ∝ VT³. But its temperature is inversely proportional to the size of the universe (the scale factor a), while its volume is proportional to the scale factor cubed. So the entropy is proportional to VT³ ∝ a³ · (1/a)³ = constant.
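The cancellation above is simple enough to check numerically. Here's a minimal sketch (the function name and the choice of sample scale factors are mine) that evaluates S ∝ VT³ with V ∝ a³ and T ∝ 1/a:

```python
# Hypothetical sketch: verify that S ∝ V T^3 is independent of the
# scale factor a, given V ∝ a^3 and T ∝ 1/a (constants dropped).

def photon_entropy(a):
    """Photon-gas entropy up to a constant factor, as a function of a."""
    V = a**3         # volume grows as the cube of the scale factor
    T = 1.0 / a      # temperature redshifts as 1/a
    return V * T**3  # S ∝ V T^3 = a^3 * (1/a)^3 = 1

# Same value for any expansion factor -- the a-dependence cancels exactly.
for a in (0.5, 1.0, 2.0, 10.0):
    print(a, photon_entropy(a))
```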
The layman's interpretation of the 2nd law really should not be that "entropy of a closed system always increases", but rather that "entropy of a closed system cannot decrease." Actually, even that is wrong: it can decrease, too! It's just vastly less probable for it to decrease, especially if the size of the system (and the number of particles it contains) is very large.
To make sense of this, and of what entropy means in a rigorous statistical-mechanics sense, I like to use an analogy with coins, as here, where I also use it as a way to understand the concept of negative temperature.
For a bunch of coins, the entropy is the logarithm of the number of ways the individual heads and tails could be arranged (the microstates of the system) to yield the same total number of heads and tails (the particular macrostate of the system). Then the 2nd law here states that if you have a large number of coins, and start flipping them randomly, then you are statistically more likely to trend toward the macrostate that has the largest number of microstates. That would be half of them being heads and half of them being tails.
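The coin-counting definition above is easy to make concrete. A minimal sketch (function names are mine): the number of microstates with k heads out of N coins is the binomial coefficient C(N, k), and the entropy of that macrostate is its logarithm (in units where Boltzmann's constant is 1):

```python
import math

def entropy(n_coins, n_heads):
    """Entropy of the macrostate with n_heads heads: log of the number
    of microstates (head/tail arrangements) realizing it."""
    return math.log(math.comb(n_coins, n_heads))

N = 100
# The entropy is maximized by the half-heads, half-tails macrostate...
S = [entropy(N, k) for k in range(N + 1)]
assert max(range(N + 1), key=lambda k: S[k]) == N // 2
# ...while all-heads has exactly one microstate, hence zero entropy.
print(entropy(N, N))       # → 0.0
print(entropy(N, N // 2))  # much larger
```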
There's nothing mysterious about why the system evolves in that direction. There are simply many more ways for it to evolve in that direction! If all ways have equal probability (this is the "fundamental assumption of statistical mechanics"), then you expect the system to evolve in the direction that has the most ways of getting there, and thus increases the entropy. Getting all heads or all tails out of coin flips isn't impossible, it's just very unlikely if you have a lot of coins.
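The "unlikely, not impossible" point can be put in numbers. Assuming fair coins and equally probable microstates (the fundamental assumption above), the probability of a macrostate with k heads out of N is C(N, k)/2^N; the sketch below (names mine) compares all-heads with the half-and-half split:

```python
import math

def macrostate_prob(n_coins, n_heads):
    """Probability of the macrostate with n_heads heads, assuming all
    2^n_coins microstates are equally likely."""
    return math.comb(n_coins, n_heads) / 2**n_coins

N = 50
print(macrostate_prob(N, N))       # all heads: one microstate, 2^-50 ≈ 9e-16
print(macrostate_prob(N, N // 2))  # half heads: about 0.11
```

Even at a mere 50 coins, the half-heads macrostate is roughly fourteen orders of magnitude more probable than all heads; with Avogadro-scale particle numbers, "very unlikely" becomes "never observed."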