Shannon–Hartley theorem
In information theory, the Shannon–Hartley theorem is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The result establishes the maximum amount of error-free digital data (that is, information) that can be transmitted over such a communication link with a specified bandwidth in the presence of noise. The law is named after Claude Shannon and Ralph Hartley. The Shannon limit or Shannon capacity of a communications channel is the theoretical maximum information transfer rate of the channel.

Introduction

Noisy channel coding theorem

Claude Shannon's noisy-channel coding theorem (1948) describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption. The theorem does not describe how to construct the error-correcting method; it only tells us how good the best possible method can be.

It establishes that, for a noisy channel with information capacity C and information transmitted at a rate R, if

R < C

there exists a coding technique which allows the probability of error at the receiver to be made arbitrarily small. This means that theoretically, it is possible to transmit information without error up to a limit, C.

The converse is also important. If

R ≥ C

the probability of error at the receiver increases without bound as the rate is increased. So no useful information can be transmitted beyond the channel capacity.

Shannon–Hartley theorem

The Shannon–Hartley theorem establishes what that channel capacity is for a finite-bandwidth, continuous-time channel subject to Gaussian noise.

If we had such a thing as an infinite-bandwidth, noise-free analog channel, we could transmit unlimited amounts of error-free data over it per unit of time. However, real-world signals have both bandwidth and noise-interference limitations.

So how do bandwidth and noise affect the rate at which information can be transmitted over an analog channel?

Surprisingly, bandwidth limitations alone do not impose a cap on maximum information transfer. This is because it is still possible (at least in a thought-experiment model) for the signal to take on an infinite number of different voltage levels on each cycle, with each slightly different level being assigned a different meaning or bit sequence. If we combine both noise and bandwidth limitations, however, we do find there is a limit to the amount of information that can be transferred, even when clever multi-level encoding techniques are used. This is because the noise signal obliterates the fine differences that distinguish the various signal levels, limiting in practice the number of detection levels we can use in our scheme.

Capacity of the additive white Gaussian noise channel

Considering all possible multi-level and multi-phase encoding techniques, the theorem states that the theoretical maximum rate C at which clean data (that is, data with an arbitrarily low bit error rate) can be sent, with a given average signal power, through an analog communication channel subject to additive white Gaussian noise is:

C = BW \times \log_2\left(\frac{S+N}{N}\right) = BW \times \log_2\left(1+\frac{S}{N}\right)

where

C is the channel capacity in bits per second, net of error correction;
BW is the bandwidth of the channel in hertz;
S is the total signal power over the bandwidth;
N is the total noise power over the bandwidth; and
S/N is the signal-to-noise ratio of the communication signal to the Gaussian noise interference, expressed as a linear power ratio (not in decibels).
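As a quick numerical illustration (not part of the original article), the formula above can be evaluated in a few lines of Python; the function name and figures below are illustrative assumptions:

import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley capacity in bits per second.
    # bandwidth_hz: channel bandwidth BW in hertz
    # snr_linear:  signal-to-noise ratio S/N as a linear power ratio (not dB)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 4 kHz channel at 20 dB SNR (S/N = 100) yields roughly 26.6 kbit/s
print(shannon_capacity(4000, 100))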

Normally the signal and noise are fully uncorrelated, and in that case S + N is the total power of the received signal and noise together. A generalization of the above equation for the case where the additive noise is not white (or where the S/N is not constant with frequency over the bandwidth) is:

C = \int_0^{BW} \log_2\left(\frac{S(f)+N(f)}{N(f)}\right) df = \int_0^{BW} \log_2\left(1+\frac{S(f)}{N(f)}\right) df

where

C is the channel capacity in bits per second, net of error correction;
BW is the bandwidth of the channel in hertz;
S(f) is the signal power spectrum;
N(f) is the noise power spectrum; and
f is frequency in hertz.
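When S(f)/N(f) varies with frequency, the integral can be approximated numerically. The sketch below is a hypothetical illustration (the flat example spectra and the midpoint-sum step count are assumptions, not from the article):

import math

def capacity_colored_noise(signal_psd, noise_psd, bandwidth_hz, steps=1000):
    # Approximate C = integral over [0, BW] of log2(1 + S(f)/N(f)) df
    # with a midpoint Riemann sum; signal_psd and noise_psd map f (Hz) to power density.
    df = bandwidth_hz / steps
    capacity = 0.0
    for i in range(steps):
        f = (i + 0.5) * df  # midpoint of each frequency slice
        capacity += math.log2(1 + signal_psd(f) / noise_psd(f)) * df
    return capacity

# With flat spectra this reduces to the simpler formula: ~26.6 kbit/s for 4 kHz at S/N = 100
print(capacity_colored_noise(lambda f: 100.0, lambda f: 1.0, 4000))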

When the signal-to-noise ratio is roughly constant over the bandwidth and is either large or small, the formula can be approximated as follows.

If S/N ≫ 1, then C ≈ 0.332 · BW · SNR, where SNR is the signal-to-noise ratio expressed in decibels.

If S/N ≪ 1, then C ≈ 1.44 · BW · S/N, where S/N is the linear power ratio.
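The quality of these approximations can be checked against the exact formula; the figures below are computed for illustration and are not taken from the article:

import math

bw = 4000.0  # hertz, example bandwidth

# Large S/N: 20 dB (linear ratio 100)
print(bw * math.log2(1 + 100))        # exact: ~26633 bit/s
print(0.332 * bw * 20)                # approximation with SNR in dB: ~26560 bit/s

# Small S/N: 0.035 (about -14.5 dB)
print(bw * math.log2(1 + 0.035))      # exact: ~198 bit/s
print(1.44 * bw * 0.035)              # approximation with linear S/N: ~202 bit/s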

The V.34 modem standard advertises a rate of 33.6 kbit/s, and V.90 claims a rate of 56 kbit/s, apparently in excess of the Shannon limit (telephone bandwidth is 3.3 kHz). In fact, neither standard actually reaches the Shannon limit. The bandwidth is not the limiting factor, because it is possible and common for modems to transmit many bits per symbol. The actual limit is the signal-to-noise ratio, which depends on the underlying plant installation. V.90 uses a clever technique that assumes the local cable from the customer site to the office equipment is free of noise and that the conversion to PCM is the only disturbance. It then maps data bits onto the equivalent voltages for the PCM codecs used in the standard telephone network(s). This only works downstream (CO to customer); the upstream is still a V.34 variant.
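A rough back-of-the-envelope check, using the capacity formula above together with the quoted 56 kbit/s and 3.3 kHz figures, shows what signal-to-noise ratio such a rate would imply (computed here for illustration only):

import math

bw = 3300.0      # hertz, nominal telephone bandwidth
rate = 56000.0   # bit/s, advertised V.90 downstream rate

snr_needed = 2 ** (rate / bw) - 1     # invert C = BW * log2(1 + S/N)
print(snr_needed)                     # ~1.3e5 as a linear power ratio
print(10 * math.log10(snr_needed))    # ~51 dB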

Examples

  1. If the S/N is 20 dB and the available bandwidth is 4 kHz, which is appropriate for telephone communications, then C = 4000 · log2(1 + 100) = 4000 · log2(101) ≈ 26.63 kbit/s. Note that the value of 100 is the linear power ratio corresponding to an S/N of 20 dB.
  2. If it is required to transmit at 50 kbit/s and a bandwidth of 1 MHz is used, then the minimum S/N required is given by 50 = 1000 · log2(1 + S/N), so S/N = 2^0.05 − 1 ≈ 0.035, corresponding to an S/N of −14.5 dB. This shows that it is possible to transmit using signals which are actually much weaker than the background noise level, as in spread-spectrum communications.
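Both examples can be reproduced with a short Python sketch (illustrative, not part of the original article):

import math

# Example 1: 4 kHz channel at 20 dB SNR
snr = 10 ** (20 / 10)                 # 20 dB -> linear power ratio of 100
print(4000 * math.log2(1 + snr))      # ~26633 bit/s, i.e. ~26.63 kbit/s

# Example 2: minimum S/N to carry 50 kbit/s over a 1 MHz channel
snr_min = 2 ** (50e3 / 1e6) - 1       # invert C = BW * log2(1 + S/N)
print(snr_min)                        # ~0.035
print(10 * math.log10(snr_min))       # ~-14.5 dB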
