<P> In various science and engineering applications, such as independent component analysis, image analysis, genetic analysis, speech recognition, manifold learning, evaluation of the status of biological systems, and time delay estimation, it is useful to estimate the differential entropy of a system or process, given some observations. </P> <P> The simplest and most common approach uses histogram-based estimation, but other approaches have been developed and used, each with its own benefits and drawbacks. The main factor in choosing a method is often a trade-off between the bias and the variance of the estimate, although the nature of the (suspected) distribution of the data may also be a factor. </P> <P> A simple way of evaluating a probability distribution \( f(x) \) of a biological variable is to compute the entropy normalized by its maximum value, \( H_{max} = \log n \): </P> <Dl> <Dd> \( H(X) = -\frac{\sum_{i=1}^{n} f(x_i) \log f(x_i)}{H_{max}} \) </Dd> </Dl>
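<P> The histogram-based, maximum-normalized estimate above can be sketched as follows. This is a minimal illustration, not a definitive implementation: the function name, the choice of 10 bins, and the use of natural logarithms are assumptions for the example. </P>

```python
import numpy as np

def normalized_entropy(samples, bins=10):
    """Histogram-based estimate of H(X)/H_max for 1-D samples.

    H_max = log(bins) is the entropy of a uniform distribution
    over the histogram bins, so the result lies in [0, 1].
    The bin count of 10 is an arbitrary choice for illustration.
    """
    counts, _ = np.histogram(samples, bins=bins)
    p = counts / counts.sum()      # empirical bin probabilities
    p = p[p > 0]                   # drop empty bins: 0 * log(0) -> 0
    h = -np.sum(p * np.log(p))     # plug-in Shannon entropy in nats
    return h / np.log(bins)        # normalize by the maximum entropy
```

<P> For roughly uniform data the estimate approaches 1, while data concentrated in a single bin gives 0; in practice the result depends on the bin count, reflecting the bias/variance trade-off mentioned above. </P>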

Sample estimate of entropy of a random vector