
Rate-distortion Theory for Secrecy Systems

Paul Cuff, Electrical Engineering, Princeton University

Secrecy

Source

Channel

Information Theory

Secrecy

Source Coding

Channel Coding

Source Coding

Describe an information signal (source) with a message.

Encoder

Decoder

Message

Information

Reconstruction

Entropy

If X^n is i.i.d. according to p_X, then R > H(X) is necessary and sufficient for lossless reconstruction.

Enumerate the typical set.

Space of X^n sequences
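The entropy bound can be illustrated numerically. A minimal sketch (the Bernoulli(0.3) source, block length n = 12, and tolerance eps are illustrative assumptions, not from the slides): the weakly typical set is a small fraction of all 2^n sequences, so enumerating it costs roughly nH(X) bits per block.

```python
import itertools
import math

# Binary entropy of a Bernoulli(p) source, in bits per symbol.
def entropy(p):
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p, n, eps = 0.3, 12, 0.1  # illustrative parameters
H = entropy(p)

# A sequence is (weakly) typical if its empirical log-likelihood rate
# -(1/n) log2 p(x^n) is within eps of the entropy H(X).
def is_typical(seq):
    ll = sum(math.log2(p) if s else math.log2(1 - p) for s in seq)
    return abs(-ll / n - H) <= eps

count = sum(is_typical(seq) for seq in itertools.product([0, 1], repeat=n))
# The typical set has at most 2^{n(H+eps)} elements, far fewer than 2^n.
print(f"H(X) = {H:.4f} bits; typical sequences: {count} of {2**n}")
```

Even at this tiny block length the typical set (715 sequences) is well under the AEP upper bound of 2^{n(H+eps)} ≈ 3507, versus 4096 sequences total.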

Many Methods

For lossless source coding, the encoding method is not so important. It should simply use the full entropy of the bits.

Single Letter Encoding (method 1)

Encode each X_i separately. Under the constraints of decodability, Huffman codes are optimal.

Expected length is within one bit of the entropy.

Encode tuples of symbols to get closer to the entropy limit.

Random Binning (method 2)

Assign to each X^n sequence a random bit sequence (hash function).

Space of X^n sequences

0110100010010
1101011000100
0100110101011
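Random binning can be sketched with an off-the-shelf hash standing in for the random bin assignment (the example sequence, the rate, and the use of SHA-256 are illustrative assumptions):

```python
import hashlib

# Random-binning sketch: hash the sequence and keep nR bits of the digest
# as the bin index.  Rate R bits per symbol gives 2^{nR} bins.
def bin_index(seq, rate_bits):
    digest = hashlib.sha256(bytes(seq)).digest()
    return int.from_bytes(digest, "big") % (1 << rate_bits)

x = [0, 1, 1, 0, 1, 0, 0, 1]
nR = 5  # 2^5 = 32 bins for this toy example
print(f"sequence {x} -> bin {bin_index(x, nR)}")
# Decoding (not shown) would search the bin for the unique typical sequence.
```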

Linear Transformation (method 3)

The message J is a random matrix applied to the source X^n.
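A sketch of compression by a random linear map over GF(2), i.e. J = A x (mod 2); the dimensions and the seed are illustrative assumptions:

```python
import random

random.seed(0)

n, k = 12, 8  # source length and message length, chosen for illustration
# Random k-by-n binary matrix A.
A = [[random.randint(0, 1) for _ in range(n)] for _ in range(k)]

def compress(x):
    # Matrix-vector product over GF(2): each message bit is a parity check.
    return [sum(a * xi for a, xi in zip(row, x)) % 2 for row in A]

x = [random.randint(0, 1) for _ in range(n)]
J = compress(x)
print(f"source bits: {x}")
print(f"message J  : {J}")
```

Linearity over GF(2) means compressing x XOR x' yields J XOR J', which is the structural property behind the Massey conjecture mentioned later.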

Summary

For lossless source coding, the structure of communication doesn't matter much.

[Plot: information gathered vs. message bits received, up to H(X^n)]

Lossy Source Coding

What if the decoder must reconstruct with less than complete information?

Error probability will be close to one.

Distortion as a performance metric: d(x^n, y^n) = (1/n) sum_i d(x_i, y_i).

Poor Performance

Random binning and random linear transformations are useless here!

[Plot: distortion E d(X,Y) vs. message bits received, with a time-sharing line]

Massey conjecture: time sharing is optimal for linear codes.

Puzzle

Describe an n-bit random sequence.

Allow 1 bit of distortion.

Send only 1 bit.

Rate Distortion Theorem

[Shannon] Choose p(y|x) to minimize the rate subject to the distortion constraint:

R(D) = min over p(y|x) with E[d(X,Y)] <= D of I(X;Y)
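For a binary source with Hamming distortion, the theorem evaluates to the closed form R(D) = h(q) - h(D). A sketch (the Bern(1/2) source and distortion level D = 0.1 are illustrative) that cross-checks the formula by grid search over test channels p(y|x):

```python
import math

def h(p):
    # Binary entropy in bits.
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

q, D = 0.5, 0.1  # Bern(q) source, Hamming distortion level D
closed_form = h(q) - h(D)  # Shannon: valid for 0 <= D <= min(q, 1-q)

def mutual_info(a, b):
    # I(X;Y) for the test channel p(y|x) = [[1-a, a], [b, 1-b]].
    px = [1 - q, q]
    pyx = [[1 - a, a], [b, 1 - b]]
    py = [sum(px[x] * pyx[x][y] for x in range(2)) for y in range(2)]
    total = 0.0
    for x in range(2):
        for y in range(2):
            joint = px[x] * pyx[x][y]
            if joint > 0:
                total += joint * math.log2(pyx[x][y] / py[y])
    return total

# Grid over channels meeting E d(X,Y) = P(Y != X) <= D; minimize I(X;Y).
grid = [i / 200 for i in range(201)]
best = min(mutual_info(a, b) for a in grid for b in grid
           if (1 - q) * a + q * b <= D)
print(f"closed form R(D) = {closed_form:.4f}, grid minimum I(X;Y) = {best:.4f}")
```

The grid minimum is attained at the symmetric test channel a = b = D, matching the closed form.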

Structure of Useful Partial Information

Coordination: given a source P_X, construct Y^n ~ P_{Y|X}.

Empirical

Strong

Empirical Coordination Codes

Codebook: a random subset of Y^n sequences.

Encoder: find the codeword that has the right joint first-order statistics with the source.
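The encoder rule can be sketched as follows (all parameters are illustrative assumptions; a real coordination code draws its codebook from P_Y^n at a rate above I(X;Y), not uniformly): pick the codeword whose joint empirical type with the source is closest, in L1 distance, to the target P_XY.

```python
import random
from collections import Counter

random.seed(1)

n, codebook_size = 60, 500  # illustrative toy parameters
# Target joint distribution: X ~ Bern(1/2), Y = X through a BSC(0.2).
target = {(x, y): 0.5 * (0.8 if y == x else 0.2)
          for x in (0, 1) for y in (0, 1)}

x_seq = [random.randint(0, 1) for _ in range(n)]
codebook = [[random.randint(0, 1) for _ in range(n)]
            for _ in range(codebook_size)]

def type_distance(y_seq):
    # L1 distance between the empirical joint type of (x_seq, y_seq)
    # and the target joint distribution.
    joint = Counter(zip(x_seq, y_seq))
    return sum(abs(joint[k] / n - target[k]) for k in target)

best = min(codebook, key=type_distance)
print(f"closest joint-type L1 distance: {type_distance(best):.3f}")
```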

Strong Coordination

The black box acts like a memoryless channel: X and Y are an i.i.d. multisource.

[Diagram: Source -> P_{Y|X} -> Output, using communication resources]

Strong Coordination

Related to:

Reverse Shannon Theorem [Bennett et al.]

Quantum Measurements [Winter]

Communication Complexity [Harsha et al.]

Strong Coordination [Cuff-Permuter-Cover]

Generating Correlated R.V.s [Anantharam, Gohari, et al.]

[Diagram: Node A sends a message to Node B, sharing common randomness; the source passes through a synthetic channel P_{Y|X} to the output.]

Structure of Strong Coord.

[Diagram: code structure with common randomness K]

Information Theoretic Security

Wiretap Channel

[Wyner 75]

Confidential Messages

[Csiszar, Korner 78]

Merhav 2008

Villard-Piantanida 2010

Other Examples of "rate-equivocation" theory

Gunduz-Erkip-Poor 2008

Lia-H. El-Gamal 2008

Tandon-Ulukus-Ramchandran 2009

...

Rate-distortion theory (secrecy)

Achievable Rates and Payoff

Given

[Schieler, Cuff 2012 (ISIT)]

How to Force High Distortion

Randomly assign bins. The adversary only knows the bin; it has no knowledge of the sequence within the bin.

Causal Disclosure

Causal Disclosure (case 1)

Causal Disclosure (case 2)

Example

The source distribution is Bernoulli(1/2). Payoff: one point if Y = X but Z ≠ X.

Rate-payoff Regions

General Disclosure

Causal or non-causal

Strong Coord. for Secrecy

[Diagram: Node A (information) sends a message to Node B (action); an adversary observes and attacks.]

Channel synthesis: not an optimal use of resources!

Strong Coord. for Secrecy

[Diagram: Node A (information) sends a message to Node B (action); an adversary observes and attacks.]

Channel synthesis, now revealing the auxiliary sequence U^n "in the clear."

Payoff-Rate Function

Maximum achievable average payoff

Markov relationship:

Theorem:

Structure of Secrecy Code

[Diagram: secrecy code structure with key K]

Intermission

Equivocation next

Log-loss Distortion

The reconstruction space of Z is the set of distributions.

Best Reconstruction Yields Entropy
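This claim can be checked numerically: since E[-log2 q(X)] = H(X) + D(p||q), the expected log-loss is minimized, at exactly the entropy H(X), by reconstructing the true distribution q = p. A sketch with an assumed three-letter source:

```python
import math
import random

random.seed(2)

# Log-loss: the reconstruction is a distribution q, with d(x, q) = -log2 q(x).
p = [0.7, 0.2, 0.1]  # illustrative source distribution
H = -sum(pi * math.log2(pi) for pi in p)

def expected_log_loss(q):
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q))

# Try many random reconstructions q; none beats q = p.
best = expected_log_loss(p)
for _ in range(10000):
    r = [random.random() + 1e-12 for _ in p]
    total = sum(r)
    best = min(best, expected_log_loss([ri / total for ri in r]))
print(f"H(X) = {H:.4f}; best expected log-loss found = {best:.4f}")
```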

Log-loss (disclose X causally)

Log-loss (disclose Y causally)

Log-loss (disclose X and Y)

Result 1 from Secrecy R-D Theory

Result 2 from Secrecy R-D Theory

Result 3 from Secrecy R-D Theory

Some Difficulties

In point-to-point, optimal communication produces stationary performance. The following scenarios lend themselves to time-varying performance.

Secure Channel

The adversary does not observe the message; it only has access to the causal disclosure.

Problem: we are not able to isolate strong and empirical coordination. Empirical coordination provides short-duration strong coordination, making optimality hard to prove.

Side Information at the intended receiver

Again, even a communication scheme built only on empirical coordination (covering) provides a short duration of strong coordination. Performance degrades in stages throughout the block.

Cascade Network

Inner and Outer Bounds

Summary

To assist an intended receiver with partial information while hindering an adversary with partial secrecy, a new encoding method is needed.

Equivocation is characterized by this rate-distortion theory.

Main new encoding feature: strong coordination superpositioned over revealed information (a.k.a. the Reverse Shannon Theorem, or distributed channel synthesis).

In many cases (e.g. side information, a secure communication channel, a cascade network), this distinct layering may not be possible.

Restate Problem---Example 1 (RD Theory)

Can we design f and g such that the requirements hold?

Equivalently: does there exist a distribution satisfying them?

(Standard formulation vs. existence of distributions.)

Restate Problem---Example 2 (Secrecy)

Can we design f and g such that the requirements hold, including a bound on Eve's score?

Equivalently: does there exist a distribution satisfying them?

(Standard formulation vs. existence of distributions.)

[Cuff 10]

Tricks with Total Variation

Technique: find a distribution p1 that is easy to analyze and satisfies the relaxed constraints. Then construct p2 to satisfy the hard constraints while maintaining small total variation distance to p1.

How?

Property 1:

Tricks with Total Variation

Technique: find a distribution p1 that is easy to analyze and satisfies the relaxed constraints. Then construct p2 to satisfy the hard constraints while maintaining small total variation distance to p1.

Why?

Property 2 (bounded functions):
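The bounded-functions property can be checked numerically: if |f| <= b, then |E_p f - E_q f| <= 2 b TV(p, q), where TV(p, q) = (1/2) sum |p - q|. A sketch over randomly drawn distributions and functions (the alphabet size and seed are illustrative assumptions):

```python
import random

random.seed(3)

def normalize(w):
    s = sum(w)
    return [x / s for x in w]

alphabet = range(6)  # illustrative alphabet size
p = normalize([random.random() for _ in alphabet])
q = normalize([random.random() for _ in alphabet])
tv = 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

b = 1.0
for _ in range(1000):
    f = [random.uniform(-b, b) for _ in alphabet]  # any function with |f| <= b
    gap = abs(sum(pi * fi for pi, fi in zip(p, f)) -
              sum(qi * fi for qi, fi in zip(q, f)))
    # The expectation gap never exceeds 2 * b * TV(p, q).
    assert gap <= 2 * b * tv + 1e-12
print(f"TV(p, q) = {tv:.4f}; bound 2*b*TV = {2 * b * tv:.4f} never violated")
```

This is why "close in total variation" suffices: any bounded payoff or distortion evaluates nearly identically under p1 and p2.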

Summary

Achievability proof techniques: pose problems in terms of the existence of joint distributions, and relax requirements to "close in total variation."

Main tool --- the reverse channel encoder, giving easy analysis of the optimal adversary.

Secrecy example: for arbitrary ε, does there exist a distribution satisfying:

Cloud Overlap Lemma

Previous encounters:

Wyner, 75 --- used divergence

Han-Verdú, 93 --- general channels, used total variation

Cuff 08, 09, 10, 11 --- provide a simple proof and utilize it for secrecy encoding

[Diagram: memoryless channel P_{X|U}(x|u)]

Reverse Channel Encoder

For simplicity, ignore the key K and consider J_a to be the part of the message that the adversary obtains (i.e. J = (J_a, J_s); ignore J_s for now).

Construct a joint distribution between the source X^n and the information J_a (revealed to the adversary) using a memoryless channel P_{X|U}(x|u).

Simple Analysis

This encoder yields a very simple analysis and convenient properties.

If |J_a| is large enough, then X^n will be nearly i.i.d. in total variation.

Performance:

[Diagram: memoryless channel P_{X|U}(x|u)]
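The soft-covering behavior behind this claim can be illustrated numerically. A toy sketch in which everything is an assumption chosen for tractability: codewords U^n(j) drawn i.i.d. Bern(1/2), a BSC(0.1) for P_{X|U} (so the target output is uniform on {0,1}^n), and block length n = 8. Here I(U;X) = 1 - h(0.1) ≈ 0.53 bits, so the induced distribution on X^n should approach i.i.d. once log2(M)/n exceeds that.

```python
import itertools
import math
import random

random.seed(4)

n, flip = 8, 0.1  # block length and BSC crossover, illustrative

def induced_tv(M):
    # Induced distribution on X^n: uniform mixture over M codewords of the
    # product channel law.  Return its total variation distance to uniform.
    codebook = [[random.randint(0, 1) for _ in range(n)] for _ in range(M)]
    tv = 0.0
    for x in itertools.product([0, 1], repeat=n):
        px = sum(
            math.prod((1 - flip) if xi == ui else flip
                      for xi, ui in zip(x, u))
            for u in codebook) / M
        tv += abs(px - 2 ** -n)
    return tv / 2

tvs = {M: induced_tv(M) for M in (2, 16, 256)}
for M, tv in tvs.items():
    print(f"M = {M:3d} codewords: TV to i.i.d. = {tv:.3f}")
```

As M grows past 2^{nI(U;X)} ≈ 19, the mixture "covers" the output space and the total variation shrinks, which is the cloud overlap phenomenon the lemma formalizes.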

Summary

Achievability proof techniques: pose problems in terms of the existence of joint distributions, and relax requirements to "close in total variation."

Main tool --- the reverse channel encoder, giving easy analysis of the optimal adversary.