<Dd>S = k<sub>B</sub> ln Ω (assuming equiprobable microstates).</Dd>
<P>Macroscopic systems typically have a very large number Ω of possible microscopic configurations. For example, the entropy of an ideal gas is proportional to the number of gas molecules N. Roughly twenty liters of gas at room temperature and atmospheric pressure has N ≈ 6 × 10<sup>23</sup> molecules (Avogadro's number). At equilibrium, each of the Ω ≈ e<sup>N</sup> configurations can be regarded as random and equally likely.</P>
<P>The second law of thermodynamics states that the entropy of an isolated system never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that amount, so that the total entropy increases. Entropy is a function of the state of the system, so the change in entropy of a system is determined by its initial and final states. In the idealization that a process is reversible, the entropy does not change, while irreversible processes always increase the total entropy.</P>
<P>Because it is determined by the number of random microstates, entropy is related to the amount of additional information needed to specify the exact physical state of a system, given its macroscopic specification. For this reason, it is often said that entropy is an expression of the disorder, or randomness, of a system, or of the lack of information about it. The concept of entropy plays a central role in information theory.</P>
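As an illustrative sketch of the formula above, one can plug the ideal-gas example into S = k<sub>B</sub> ln Ω. The round figures (N ≈ 6 × 10<sup>23</sup> molecules, Ω ≈ e<sup>N</sup>, so ln Ω ≈ N) come from the text; the Boltzmann constant value is the exact 2019 SI definition.

```python
import math

# Boltzmann entropy S = k_B * ln(Omega), for equiprobable microstates.
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)

N = 6e23             # molecules in ~20 L of gas at room conditions (rounded)
ln_Omega = N         # since Omega ≈ e^N, ln(Omega) ≈ N

S = k_B * ln_Omega   # entropy in joules per kelvin
print(round(S, 2))   # ≈ 8.28 J/K
```

Note how an astronomically large microstate count Ω ≈ e<sup>N</sup> collapses, via the logarithm, to an entropy of order a few joules per kelvin.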
