Page 1: PDC PPT

CHANNEL CAPACITY

Topics Included:
- Introduction
- Channel Capacity
- Nyquist Rate
- Shannon's Theorem
- Signal and Noise
- Signal to Noise Ratio
- Bandwidth
- Relation between Noise, Channel Capacity, Bandwidth and Signal
- Why Binary is Best???
- Channel Coding Theorem
- Advantages

Page 2: PDC PPT

INTRODUCTION…

Channel capacity, often shown as "C" in communication formulas, is the maximum number of discrete information bits that a defined segment of a communications medium can carry. Thus, a telephone wire may be considered a channel in this sense.

Breaking the frequency bandwidth into smaller sub-segments and using each of them to carry communication reduces the number of bits of information that each segment can carry. The total number of bits of information that the entire wire can carry is not increased by breaking it into smaller sub-segments.

In reality, this sub-segmentation reduces the total amount of information that the wire can carry due to the additional overhead of information that is required to distinguish the sub-segments from each other.

Page 3: PDC PPT

CONTD…

Two additional considerations: The channel can be light beams, radio waves, a specified bandwidth, a book, or even elementary particles from whose state information may be gleaned.

In addition, an analog signal, such as the human voice, can be described in terms of the smallest differences that can be detected, and these differences can be counted and treated as "bits" of information.

Page 4: PDC PPT

CHANNEL CAPACITY…

The bit rate of a system increases with the number of signal levels used to denote a symbol.

A symbol can consist of a single bit or of "n" bits, in which case the number of signal levels is 2^n. As the number of levels goes up, the spacing between levels decreases, increasing the probability of an error occurring in the presence of transmission impairments.

Page 5: PDC PPT

NYQUIST RATE…

Nyquist gives the upper bound for the bit rate of a transmission system by calculating the bit rate directly from the number of bits in a symbol (or signal levels) and the bandwidth of the system (assuming 2 symbols per cycle and the first harmonic).

The Nyquist theorem states that for a noiseless channel:

C = 2B log2(2^n) = 2Bn

C = capacity in bps
B = bandwidth in Hz
n = number of bits per symbol (so 2^n is the number of signal levels)
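
As a quick sanity check, here is a minimal Python sketch of the Nyquist formula; the 3 kHz bandwidth and 2 bits per symbol are made-up illustration values, not figures from the slides:

    import math

    def nyquist_capacity(bandwidth_hz, bits_per_symbol):
        # Noiseless Nyquist bit rate: C = 2 * B * log2(2**n) = 2 * B * n
        levels = 2 ** bits_per_symbol
        return 2 * bandwidth_hz * math.log2(levels)

    # Example: a 3 kHz channel carrying 2 bits per symbol (4 signal levels)
    print(nyquist_capacity(3000, 2))   # 12000.0 bps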

Page 6: PDC PPT

SHANNON’S THEOREM…

Any discussion about the design of a communication system will be incomplete without mentioning Shannon's Theorem. Shannon's information theory tells us the amount of information a channel can carry. In other words, it specifies the capacity of the channel. The theorem can be stated in simple terms as follows:

A given communication system has a maximum rate of information C known as the channel capacity.

If the information rate R is less than C, then one can approach arbitrarily small error probabilities by using intelligent coding techniques.

Page 7: PDC PPT

CONTD…

To get lower error probabilities, the encoder has to work on longer blocks of signal data. This entails longer delays and higher computational requirements.

The Shannon-Hartley theorem indicates that with sufficiently advanced coding techniques, transmission at channel capacity can occur with arbitrarily small error. One can intuitively reason that, for a given communication system, as the information rate increases the number of errors per second will also increase.

Page 8: PDC PPT

CONTD…

Shannon-Hartley Equation:

C = B log2(1+S/N)

where C is the maximum capacity of the channel in bits per second, B is the bandwidth of the channel in hertz, S is the signal power, and N is the noise power. It can be seen that the maximum rate at which we can transmit information is set by the bandwidth, the signal level, and the noise level.
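
A minimal Python sketch of this formula; the bandwidth and S/N values are made-up illustration numbers:

    import math

    def shannon_capacity(bandwidth_hz, signal_power, noise_power):
        # Shannon-Hartley capacity: C = B * log2(1 + S/N)
        return bandwidth_hz * math.log2(1 + signal_power / noise_power)

    # Example: a 3 kHz telephone channel with S/N = 1000
    print(shannon_capacity(3000, 1000, 1))   # about 29,900 bps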

Page 9: PDC PPT

CONTD…

The expression for the channel capacity makes intuitive sense:

As the bandwidth of the channel increases, it is possible to make faster changes in the information signal, thereby increasing the information rate.

As S/N increases, one can increase the information rate while still preventing errors due to noise.

For no noise, S/N = infinity, and an infinite information rate is possible irrespective of bandwidth.

Thus we may trade off bandwidth for SNR (S/N, the signal-to-noise ratio).

However, as B -> infinity the channel capacity does not become infinite, since with an increase in bandwidth the noise power also increases: for white noise the noise power grows in proportion to the bandwidth, so the capacity approaches a finite limit.
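
A small Python sketch of this effect, assuming white noise whose power grows in proportion to bandwidth (N = N0 * B); the values of S and N0 are arbitrary illustration numbers:

    import math

    S = 1.0      # signal power (arbitrary units)
    N0 = 1e-3    # noise power per Hz of bandwidth (arbitrary units)

    for B in (1e3, 1e4, 1e5, 1e6):
        C = B * math.log2(1 + S / (N0 * B))
        print(f"B = {B:>9.0f} Hz -> C = {C:7.1f} bps")
    # Capacity keeps rising but flattens toward S/N0 * log2(e), about 1443 bps here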

Page 10: PDC PPT

CONTD…

Here the channel encoder encodes the information W1W2... using channel coding techniques, and the coded data is represented as X1X2.... When this codeword travels across the channel it gets corrupted by noise and is received as Y1Y2.... The channel decoder recovers the original information (estimates the content of the signal). The accuracy of the recovered information depends on the coding technique, the bandwidth of the channel, the code rate, the signal strength, and the noise power.
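
As an illustration of this encoder/channel/decoder chain, here is a minimal Python sketch using a 3-bit repetition code and a simulated channel that flips bits at random; the flip probability is a made-up illustration value, and the slides do not name a specific code:

    import random

    def encode(bits):
        # Channel encoder: repeat every information bit three times
        return [b for b in bits for _ in range(3)]

    def channel(codeword, flip_prob=0.1):
        # Noisy channel: flip each transmitted bit with probability flip_prob
        return [b ^ 1 if random.random() < flip_prob else b for b in codeword]

    def decode(received):
        # Channel decoder: majority vote over each group of three received bits
        return [1 if sum(received[i:i + 3]) >= 2 else 0
                for i in range(0, len(received), 3)]

    message = [1, 0, 1, 1, 0, 0, 1, 0]
    estimate = decode(channel(encode(message)))
    print(message, estimate)   # usually identical, at the cost of a 1/3 code rate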

Page 11: PDC PPT

SIGNAL AND NOISE…

Signal is the information you want to transmit.

Noise is just another signal, added to and interfering with the signal you want to transmit.

Some noise is random and unavoidable and comes from natural sources.

Some noise is intentional and is actually someone else's signal.

A party can be "noisy" even though most of the "noise" is just conversations other than yours!

Page 12: PDC PPT

CONTD…

If the noise is "soft" it is easy to pick out the signal.

If the noise is "loud" it introduces many errors into the received signal.

In a digital communications channel the noise level affects the channel capacity.

"Loud" noise can be compensated for by channel coding, at the expense of a lower data rate.

Recall Shannon's channel coding theorem: the error rate can be made as close to zero as desired, as long as the rate at which bits are transmitted does not exceed the channel capacity.

Page 13: PDC PPT

SIGNAL TO NOISE RATIO…

The "loudness" of the signal and of the noise is their power.

The key parameter is the signal-to-noise ratio:

SNR = S/N

where S = signal power and N = noise power.

High SNR = clearer signal = higher channel capacity.
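
SNR is often quoted in decibels (not mentioned on the slide, but standard practice); a short Python sketch of the conversion and its effect on capacity, with illustrative numbers:

    import math

    def db_to_linear(snr_db):
        # Convert an SNR in dB to a linear power ratio S/N
        return 10 ** (snr_db / 10)

    snr = db_to_linear(30)              # 30 dB -> S/N = 1000.0
    print(4000 * math.log2(1 + snr))    # about 39,900 bps on a 4 kHz channel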

Page 14: PDC PPT

Restoration of Digital Signals

We know that a fundamental advantage of digital representation over analog is that data can be restored

(Figure: a two-level digital signal, levels 0 and 1, being restored.)

Page 15: PDC PPT

CONTD…

If signals had four possible levels rather than two, in each time slice two bits of information could be transmitted.

If the levels were closer together, the thresholding would be harder -- same noise => more errors

If the levels were kept the same distance apart, more signal power would be needed.
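
A small Python sketch of this trade-off, assuming the levels are spread evenly across a fixed 1-volt range (a made-up illustration value):

    # Spacing between adjacent levels when 2**n levels share a fixed 1 V range
    V_RANGE = 1.0
    for n in (1, 2, 3, 4):
        levels = 2 ** n
        spacing = V_RANGE / (levels - 1)
        print(f"{n} bit(s)/symbol -> {levels:2d} levels, spacing = {spacing:.3f} V")
    # Each extra bit per symbol roughly halves the spacing,
    # so the same noise causes more threshold errors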

Page 16: PDC PPT

BANDWIDTH, LITERALLY…

Bandwidth = the width of a frequency range.

E.g. the AM band is 530-1700 kHz, for a bandwidth of about 1200 kHz.

Within a band, signals (e.g. radio stations) have to be kept a certain distance apart to avoid interference (you could not have stations at both 1030 and 1031 kHz).

More bandwidth => more "stations," "channels," i.e., more channel capacity.

With more bandwidth it is possible to transmit more information.

Page 17: PDC PPT

SIGNAL, NOISE, CHANNEL CAPACITY, BANDWIDTH

These four are interrelated:

Stronger signal (S) => higher channel capacity C.
More noise (N) => lower C.
More bandwidth (B) => higher C.

C = B log2(1+S/N)

(the Shannon-Hartley theorem)

So channel capacity increases linearly with bandwidth but only logarithmically with the signal-to-noise ratio.
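
A short Python comparison of the two effects, using illustrative numbers (a 1 MHz channel and an SNR of 100):

    import math

    def capacity(B, snr):
        # Shannon-Hartley capacity in bps
        return B * math.log2(1 + snr)

    B, snr = 1e6, 100
    print(capacity(B, snr))        # baseline: about 6.7 Mbps
    print(capacity(2 * B, snr))    # double the bandwidth: capacity doubles
    print(capacity(B, 2 * snr))    # double the SNR: capacity gains only ~1 Mbps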

Page 18: PDC PPT

WHY BINARY IS BEST???

Usually noise is uncontrollable.

So to increase channel capacity, the engineer must increase either bandwidth or signal power.

Power is a precious resource! Using more power in a PC or cell phone => bigger battery, shorter battery life, etc.

With only two signal levels, power usage is minimized.

To achieve a fixed data rate, we can use 1000x less power if we can get 10x more bandwidth!
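
The 1000x / 10x figure can be sanity-checked against the Shannon-Hartley formula. A minimal Python sketch, assuming white noise of fixed spectral density; the 14 Mbps target rate, the two bandwidths, and the noise density are illustrative values chosen to make the ratio come out near 1000:

    def power_needed(rate_bps, bandwidth_hz, n0=1e-9):
        # Invert C = B * log2(1 + S / (n0 * B)) for the signal power S
        return n0 * bandwidth_hz * (2 ** (rate_bps / bandwidth_hz) - 1)

    RATE = 14e6                            # fixed target data rate: 14 Mbps
    p_narrow = power_needed(RATE, 1e6)     # using 1 MHz of bandwidth
    p_wide = power_needed(RATE, 10e6)      # using 10 MHz of bandwidth
    print(p_narrow / p_wide)               # roughly 1000x less power needed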

Page 19: PDC PPT

CHANNEL CODING THEOREM…

For a given channel, there exists a code that will permit error-free transmission across the channel at a rate R, provided R <= C, the channel capacity.

Equality is achieved only when the SNR is infinite.

Channel capacity is a non-dimensional number.

Page 20: PDC PPT

CONTD…

The transmitted bit rate is increased by adding redundant information calculated from the source data.

This allows the detection or correction of signal errors introduced by the transmission medium.

The coding algorithm and coding rate vary for different transport channels and for different types of data.
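
A minimal Python sketch of the "add redundancy calculated from the source data" idea, using a single even-parity bit for error detection; this is a deliberately simple scheme, not one named on the slides:

    def add_parity(bits):
        # Append one even-parity bit computed from the source data
        return bits + [sum(bits) % 2]

    def check_parity(received):
        # Detect (but not correct) an odd number of bit errors
        return sum(received) % 2 == 0

    codeword = add_parity([1, 0, 1, 1])    # -> [1, 0, 1, 1, 1]
    corrupted = codeword[:]
    corrupted[2] ^= 1                      # the channel flips one bit
    print(check_parity(codeword))          # True: looks error-free
    print(check_parity(corrupted))         # False: error detected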

Page 21: PDC PPT

ADVANTAGES…

Structured redundancy is added to the message information.

It is chosen so that the bits of the transmitted message are algebraically related.

Page 22: PDC PPT

THANK YOU…

Made by: Jyoti Choudhary, Aanchal Shekhawat, Akanksha Singh