Both these bounds are derived directly from the Chernoff bound. It can also be shown that

$$\Pr(X \geq k) = F(n-k;\, n, 1-p) \geq \frac{1}{(n+1)^{2}} \exp\left(-nD\left(\frac{k}{n} \,\Big\|\, p\right)\right) \qquad \text{if } p < \frac{k}{n} < 1.$$

This is proved using the method of types (see, for example, chapter 12 of Elements of Information Theory by Cover and Thomas).
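As a quick sanity check, the bound can be verified numerically. The sketch below compares the exact tail probability with the lower bound, taking $D$ to be the Bernoulli relative entropy $D(a \| p) = a \ln\frac{a}{p} + (1-a)\ln\frac{1-a}{1-p}$; the parameters n = 100, p = 0.3, k = 45 are illustrative choices (not from the text) satisfying p < k/n < 1.

```python
import math

def kl_bernoulli(a, p):
    """Relative entropy D(a || p) between Bernoulli(a) and Bernoulli(p)."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def binom_upper_tail(n, p, k):
    """Exact Pr(X >= k) for X ~ Binomial(n, p), summed directly."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Illustrative parameters satisfying p < k/n < 1 (assumed, not from the text).
n, p, k = 100, 0.3, 45
exact = binom_upper_tail(n, p, k)
lower = math.exp(-n * kl_bernoulli(k / n, p)) / (n + 1) ** 2

print(f"Pr(X >= {k}) = {exact:.3e}, lower bound = {lower:.3e}")
assert lower <= exact  # the method-of-types lower bound holds
```

For these values the exact tail is on the order of 1e-4 while the bound is on the order of 1e-7, consistent with the bound being tight only in the exponent: both decay as exp(-nD(k/n || p)), and the polynomial factor 1/(n+1)^2 absorbs the difference.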
