S = k_B \ln W

where S is the thermodynamic entropy of a particular macrostate (defined by thermodynamic parameters such as temperature, volume, energy, etc.), W is the number of microstates (various combinations of particles in various energy states) that can yield the given macrostate, and k_B is the Boltzmann constant. It is assumed that each microstate is equally likely, so that the probability of a given microstate is p_i = 1/W. When these probabilities are substituted into the above expression for the Gibbs entropy (or, equivalently, k_B times the Shannon entropy), Boltzmann's equation results. In information-theoretic terms, the information entropy of a system is the amount of "missing" information needed to determine a microstate, given the macrostate.

In the view of Jaynes (1957), thermodynamic entropy, as explained by statistical mechanics, should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being proportional to the amount of further Shannon information needed to define the detailed microscopic state of the system that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics, with the constant of proportionality being just the Boltzmann constant. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states of the system that are consistent with the measurable values of its macroscopic variables, thus making any complete state description longer. (See the article maximum entropy thermodynamics.) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes first to acquire and store; and so the total thermodynamic entropy does not decrease (which resolves the paradox). Landauer's principle imposes a lower bound on the amount of heat a computer must generate to process a given amount of information, though modern computers are far less efficient.

Entropy is defined in the context of a probabilistic model. Independent fair coin flips have an entropy of 1 bit per flip. A source that always generates a long string of B's has an entropy of 0, since the next character will always be a 'B'.
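The substitution mentioned in the first paragraph above can be sketched explicitly, assuming the standard Gibbs entropy expression (which the text refers to but does not write out): with all W microstates equally likely, p_i = 1/W, and

S = -k_B \sum_{i=1}^{W} p_i \ln p_i = -k_B \sum_{i=1}^{W} \frac{1}{W} \ln \frac{1}{W} = k_B \ln W,

which recovers Boltzmann's equation.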
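The coin-flip and constant-source examples in the last paragraph can be checked numerically. The following is a minimal Python sketch, not taken from the source; the function names are illustrative:

    import math
    from collections import Counter

    def shannon_entropy(probabilities):
        """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Independent fair coin flips: two equally likely outcomes -> 1 bit per flip.
    print(shannon_entropy([0.5, 0.5]))   # 1.0

    # A source that always emits 'B': one certain outcome -> 0 bits.
    print(shannon_entropy([1.0]))        # -0.0, i.e. 0 bits

    # Estimating symbol probabilities from an observed string.
    def entropy_of_string(s):
        counts = Counter(s)
        return shannon_entropy([c / len(s) for c in counts.values()])

    print(entropy_of_string("BBBBBBBB"))  # 0.0
    print(entropy_of_string("HTTHHTHT"))  # 1.0 (equal counts of H and T)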

Derivation of average information in data compression techniques