<Dd> S = k_B ln Ω (assuming equiprobable states) </Dd> <P> This is consistent with 19th-century formulas for entropy in terms of heat and temperature, as discussed below. Boltzmann's constant, and therefore entropy, have dimensions of energy divided by temperature. </P> <P> For example, gas in a container with known volume, pressure, and energy could have an enormous number of possible configurations of the collection of individual gas molecules. At equilibrium, each instantaneous configuration of the gas may be regarded as random. Entropy may be understood as a measure of disorder within a macroscopic system. The second law of thermodynamics states that an isolated system's entropy never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that amount. Since entropy is a function of the state of the system, a change in entropy of a system is determined by its initial and final states. This applies whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment. </P> <P> In the mid-19th century, the change in entropy (ΔS) of a system undergoing a thermodynamically reversible process was defined by Rudolf Clausius as: </P>
<Dd> ΔS = ∫ dQ_rev / T </Dd>

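The two definitions above can be put into numbers. The following is a minimal sketch in Python: the function names `boltzmann_entropy` and `clausius_delta_s` are illustrative, and the latent heat of fusion of ice (about 334 J/g) is an assumed example value; Boltzmann's constant is the exact value fixed by the 2019 SI redefinition.

```python
import math

# Boltzmann's constant in J/K (exact value in the 2019 SI redefinition)
K_B = 1.380649e-23

def boltzmann_entropy(omega: float) -> float:
    """Statistical entropy S = k_B * ln(Omega) for Omega equiprobable microstates."""
    return K_B * math.log(omega)

def clausius_delta_s(q_rev: float, temperature: float) -> float:
    """Entropy change dS = Q_rev / T for heat transferred reversibly at constant T."""
    return q_rev / temperature

# Entropy is additive over independent subsystems, because microstate
# counts multiply: ln(O1 * O2) = ln(O1) + ln(O2).
s_combined = boltzmann_entropy(1e10 * 1e10)
s_parts = boltzmann_entropy(1e10) + boltzmann_entropy(1e10)

# Clausius example: melting 1 g of ice reversibly at 273.15 K,
# using an assumed latent heat of fusion of 334 J/g.
delta_s_ice = clausius_delta_s(334.0, 273.15)
print(s_combined, s_parts, delta_s_ice)
```

Note the additivity check: the entropy of the combined system equals the sum of the parts, which is exactly why the logarithm appears in Boltzmann's formula.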
Entropy can be expressed as a function of