More Details on the Noisy Channel Rate Example



Presentation Transcript

Slide1

More Details on the Noisy Channel Rate Example
CS 118 Computer Network Fundamentals
Peter Reiher

Slide2

Split things . . .

The channel matrix (probability of receiving each signal, given what was sent):

             Receive a 0    Receive a 1
Send a 0         .5             .5
Send a 1          1              0

More complex noise . . .

Slide3

What is the capacity of this last channel?
How many bits per second are we effectively communicating?
Rate of channel = H(X) + H(Y) - H(X,Y)
Intuitively, the bits per second that the sender and receiver "share"
Let's calculate that for our example
Using information from the matrix
Working on the assumption that the sender is equally likely to send 0 or 1
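To make the setup concrete, here is a minimal Python sketch (not from the slides; the names prior and channel are my own) encoding the 50/50 source and the channel matrix above. The later snippets follow the same representation.

prior = {0: 0.5, 1: 0.5}            # P(send i): sender is equally likely to send 0 or 1

# channel[i][j] = P(receive j | send i), read straight off the matrix above
channel = {
    0: {0: 0.5, 1: 0.5},            # send 0: received as 0 or 1 with equal chance
    1: {0: 1.0, 1: 0.0},            # send 1: always received as 0
}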

Slide4

So what is the conditional entropy for this channel?
First we need the entropy of the source H(X)
We also need the entropy of the receiver H(Y)
And the joint entropy H(X,Y)

Slide5

H(X)
X is the original source of the information
H(X) = -∑ p_i log(p_i)
There are two possible signals: 0 and 1
Each is equally probable
According to the definition of the scenario

Slide6

Calculating H(X)
H(X) = -∑ p_i log(p_i)
H(X) = -(.5 log(.5) + .5 log(.5))
H(X) = -(.5 (-1) + .5 (-1))
H(X) = -(-.5 - .5) = -(-1) = 1
H(X) = 1
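A quick numerical check of this step, as a hedged Python sketch (base-2 logs, which is what the slides use, since log(.5) = -1):

import math

p_x = [0.5, 0.5]                              # the two equally likely source signals
H_X = -sum(p * math.log2(p) for p in p_x)     # -(.5*log2(.5) + .5*log2(.5))
print(H_X)                                    # 1.0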

Slide7

H(Y)
Y is the received signal, which was affected by noise
H(Y) is the entropy of that signal
Since the signal received depends on the signal sent, the equation is a little different:
H(Y) = -∑ p(i,j) log(∑ p(i,j))
The probability that 0 or 1 was sent is still 50/50

Slide8

Calculating H(Y)
The equation is H(Y) = -∑ p(i,j) log(∑ p(i,j))
First summation over i and j, second over i
We need the various p(i,j)'s, so let's get those
What does p(i,j) mean? The probability that signal i was sent and signal j received
Two possible signals sent or received
So four possible p(i,j)'s
The sum of all four is still 1

Slide9

The p(i,j)'s
p(0,0) = .25
p(0,1) = .25
p(1,0) = .5
p(1,1) = 0
How did I get those?

Slide10

Back to the matrix
50% chance of sending 0
If we send 0, 50% chance of receiving 0
p(0,0) = .5 * .5 = .25
If we send 0, 50% chance of receiving 1
p(0,1) = .5 * .5 = .25

             Receive a 0    Receive a 1
Send a 0         .5             .5
Send a 1          1              0

Slide11

And for sending a 1
50% chance of sending 1
If we send 1, 100% chance of receiving 0
p(1,0) = .5 * 1 = .5
If we send 1, 0% chance of receiving 1
p(1,1) = .5 * 0 = 0

             Receive a 0    Receive a 1
Send a 0         .5             .5
Send a 1          1              0
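The same "chance of sending i times chance of receiving j given i" multiplication can be checked in a few lines of Python; a sketch with assumed names (prior, channel, joint), not from the slides:

prior = {0: 0.5, 1: 0.5}                      # P(send i)
channel = {0: {0: 0.5, 1: 0.5},               # P(receive j | send i)
           1: {0: 1.0, 1: 0.0}}

# p(i,j) = P(send i) * P(receive j | send i)
joint = {(i, j): prior[i] * channel[i][j] for i in (0, 1) for j in (0, 1)}
print(joint)                 # {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.5, (1, 1): 0.0}
print(sum(joint.values()))   # 1.0 -- the four probabilities still sum to 1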

Slide12

The p(i,j)'s
p(0,0) = .25
p(0,1) = .25
p(1,0) = .5
p(1,1) = 0

Slide13

Back to H(Y)
H(Y) = -∑ p(i,j) log(∑ p(i,j))
Remember, first summation over i and j, second over i
H(Y) = -(p(0,0) log(p(0,0) + p(1,0)) + p(0,1) log(p(0,1) + p(1,1)) + p(1,0) log(p(0,0) + p(1,0)) + p(1,1) log(p(0,1) + p(1,1)))
Fill in the p's

Slide14

Filling in the p's for H(Y)
H(Y) = -(.25 log(.25 + .5) + .25 log(.25 + 0) + .5 log(.25 + .5) + 0 log(.25 + 0))
H(Y) = -(.25 log(.75) + .25 log(.25) + .5 log(.75) + 0 log(.25))

Slide15

Working H(Y) out
H(Y) = -(.25 * -.415 + .25 * -2 + .5 * -.415 + 0 * -2)
H(Y) = -(-.104 - .5 - .208 + 0)
H(Y) = -(-.81)
H(Y) = .81
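Checking the H(Y) arithmetic in Python. Note that ∑ p(i,j) over i is just the overall probability of receiving j (.75 for a 0, .25 for a 1), so this is the ordinary entropy of the received signal. A sketch, assuming the joint probabilities computed above:

import math

joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.5, (1, 1): 0.0}

# Marginal probability of receiving j: sum over the sent signal i.
p_y = {j: joint[(0, j)] + joint[(1, j)] for j in (0, 1)}      # {0: 0.75, 1: 0.25}
H_Y = -sum(p * math.log2(p) for p in p_y.values() if p > 0)
print(H_Y)                                                    # ~0.811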

Slide16

OK, now H(X,Y)
H(X,Y) = -∑ p(i,j) log(p(i,j))
Summation over both i and j
H(X,Y) = -(p(0,0) log(p(0,0)) + p(0,1) log(p(0,1)) + p(1,0) log(p(1,0)) + p(1,1) log(p(1,1)))
We'll need our p(i,j)'s again
Same ones as for H(Y)

Slide17

The p(i,j)'s
p(0,0) = .25
p(0,1) = .25
p(1,0) = .5
p(1,1) = 0

Slide18

Calculating H(X,Y)
H(X,Y) = -(p(0,0) log(p(0,0)) + p(0,1) log(p(0,1)) + p(1,0) log(p(1,0)) + p(1,1) log(p(1,1)))
H(X,Y) = -(.25 log .25 + .25 log .25 + .5 log .5 + 0 log 0)

Slide19

Finishing the H(X,Y) calculation
H(X,Y) = -(.25 * -2 + .25 * -2 + .5 * -1 + 0)
H(X,Y) = -(-.5 - .5 - .5 + 0)
H(X,Y) = -(-1.5)
H(X,Y) = 1.5
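And a check of the joint entropy; the 0 log 0 term is treated as 0, which the "if p > 0" guard handles. Again a sketch using the same assumed joint distribution:

import math

joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.5, (1, 1): 0.0}
H_XY = -sum(p * math.log2(p) for p in joint.values() if p > 0)   # 0*log(0) taken as 0
print(H_XY)                                                      # 1.5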

Slide20

Working it out
R = H(X) + H(Y) - H(X,Y)
R = 1 + .81 - 1.5
R = .31
We're effectively communicating around 1/3 of a bit per second
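Putting the three pieces together in one self-contained sketch. The quantity H(X) + H(Y) - H(X,Y) is the mutual information between the sent and received signals; to three decimal places it comes out to about 0.311 bits per symbol, which the slides round to .31. Names and structure here are my own, not from the slides:

import math

def entropy(probs):
    # Base-2 entropy, skipping zero-probability terms (0*log 0 = 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.5, (1, 1): 0.0}
p_x = {i: joint[(i, 0)] + joint[(i, 1)] for i in (0, 1)}      # {0: 0.5, 1: 0.5}
p_y = {j: joint[(0, j)] + joint[(1, j)] for j in (0, 1)}      # {0: 0.75, 1: 0.25}

R = entropy(p_x.values()) + entropy(p_y.values()) - entropy(joint.values())
print(R)                                                      # ~0.311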