<P> Surprisingly, bandwidth limitations alone do not impose a cap on the maximum information rate. This is because it is still possible for the signal to take on an indefinitely large number of different voltage levels on each symbol pulse, with each slightly different level assigned a different meaning or bit sequence. If we combine both noise and bandwidth limitations, however, we do find that there is a limit to the amount of information that can be transferred by a signal of bounded power, even when clever multi-level encoding techniques are used. </P> <P> In the channel considered by the Shannon--Hartley theorem, noise and signal are combined by addition. That is, the receiver measures a signal equal to the sum of the signal encoding the desired information and a continuous random variable representing the noise. This addition creates uncertainty as to the original signal's value. If the receiver has some information about the random process that generates the noise, one can in principle recover the information in the original signal by considering all possible states of the noise process. In the case of the Shannon--Hartley theorem, the noise is assumed to be generated by a Gaussian process with a known variance. Since the variance of a Gaussian process is equivalent to its power, it is conventional to call this variance the noise power. </P> <P> Such a channel is called the Additive White Gaussian Noise (AWGN) channel, because Gaussian noise is added to the signal; "white" means equal amounts of noise at all frequencies within the channel bandwidth. Such noise can arise both from random sources of energy and from coding and measurement error at the sender and receiver, respectively. Since sums of independent Gaussian random variables are themselves Gaussian random variables, the analysis is conveniently simplified if one assumes that such error sources are also Gaussian and independent. 
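<P> The additive model above can be illustrated with a minimal simulation sketch. The code below assumes a hypothetical two-level signal with illustrative power values; it draws noise from a Gaussian with known variance (the "noise power") and checks that, for independent signal and noise, the measured power of the received sum is approximately the sum of the two powers. </P>

```python
import random
import statistics

random.seed(0)

# Hypothetical, illustrative parameters (not from the article):
signal_power = 4.0   # variance (power) of the transmitted levels
noise_power = 1.0    # variance (power) of the additive Gaussian noise

levels = [-2.0, 2.0]  # two equally likely voltage levels, power = 4
n = 100_000

received = []
for _ in range(n):
    s = random.choice(levels)
    # Additive white Gaussian noise with the known variance.
    w = random.gauss(0.0, noise_power ** 0.5)
    # The receiver measures signal plus noise.
    received.append(s + w)

# For independent signal and noise the powers add,
# so the measured power should be close to 4 + 1 = 5.
measured_power = statistics.pvariance(received)
print(measured_power)
```

<P> The uncertainty the noise creates is visible here: a received value near 0 could have come from either level, and the receiver can only resolve it statistically. </P>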
</P> <P> Comparing the channel capacity to the information rate from Hartley's law, we can find the effective number of distinguishable levels M: setting Hartley's rate 2B log2(M) equal to the capacity C = B log2(1 + S/N) gives M = √(1 + S/N). </P>
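<P> As a quick numerical sketch of this comparison, the snippet below uses assumed, illustrative channel parameters (B = 3000 Hz and a linear S/N of 1000, i.e. 30 dB) to compute the Shannon--Hartley capacity and the effective number of levels M = √(1 + S/N), confirming that Hartley's rate at that M reproduces the capacity. </P>

```python
import math

# Hypothetical channel parameters (illustrative, not from the article):
B = 3000.0    # bandwidth in hertz
snr = 1000.0  # S/N as a linear power ratio (30 dB)

# Shannon--Hartley capacity in bits per second.
C = B * math.log2(1.0 + snr)

# Effective number of distinguishable levels from equating
# C with Hartley's rate R = 2B log2(M), giving M = sqrt(1 + S/N).
M = math.sqrt(1.0 + snr)
R = 2.0 * B * math.log2(M)

print(C, R, M)  # R matches C; M is about 31.6 levels
```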
