<P> where k_B is the Boltzmann constant, equal to 1.380 649 × 10⁻²³ J / K . The summation is over all the possible microstates of the system, and p_i is the probability that the system is in the i - th microstate . This definition assumes that the basis set of states has been picked so that there is no information on their relative phases . In a different basis set, the more general expression is </P> <Dl> <Dd> S = − k_B Tr ⁡ ( ρ̂ log ⁡ ( ρ̂ ) ), {\displaystyle S=-k_{\mathrm{B}}\operatorname{Tr}({\widehat{\rho}}\log({\widehat{\rho}})),} </Dd> </Dl> <P> where ρ̂ {\displaystyle {\widehat{\rho}}} is the density matrix, Tr {\displaystyle \operatorname{Tr}} is the trace, and log {\displaystyle \log} is the matrix logarithm . This density matrix formulation is not needed in cases of thermal equilibrium so long as the basis states are chosen to be energy eigenstates . For most practical purposes, this can be taken as the fundamental definition of entropy, since all other formulas for S can be mathematically derived from it, but not vice versa . </P>
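The relationship between the two formulas can be sketched numerically: diagonalizing the density matrix ρ̂ yields eigenvalue probabilities p_i, and the trace formula −k_B Tr(ρ̂ log ρ̂) then reduces to the sum −k_B Σ p_i ln p_i. A minimal illustration for a 2×2 real symmetric density matrix (the helper names and example matrices below are illustrative, not from the source):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def entropy_from_probs(probs):
    # Gibbs / von Neumann entropy: S = -k_B * sum(p_i * ln p_i).
    # Zero-probability terms are skipped, since p ln p -> 0 as p -> 0.
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

def eig2x2_symmetric(rho):
    # Eigenvalues of a real symmetric 2x2 matrix [[a, b], [b, d]],
    # computed from its trace and determinant.
    (a, b), (_, d) = rho
    tr, det = a + d, a * d - b * b
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    return [(tr + disc) / 2, (tr - disc) / 2]

# Diagonal density matrix in the energy eigenbasis: the diagonal
# entries are the microstate probabilities, giving S = k_B ln 2.
rho_diag = [[0.5, 0.0], [0.0, 0.5]]
print(entropy_from_probs(eig2x2_symmetric(rho_diag)))

# A pure state has eigenvalues 1 and 0, hence zero entropy.
rho_pure = [[0.5, 0.5], [0.5, 0.5]]
print(entropy_from_probs(eig2x2_symmetric(rho_pure)))
```

The eigendecomposition step is what makes the trace formula basis-independent: any unitary change of basis leaves the eigenvalues, and therefore the entropy, unchanged.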

Entropy may be expressed as a function of pressure and temperature