Presentation Transcript


Secure Communication for Distributed Systems

Paul Cuff
Electrical Engineering
Princeton University

Overview

Application: a framework for secrecy of distributed systems
Theoretical result: information theory in a competitive context (zero-sum game)
Two methods of coordination

Main Idea

Secrecy for distributed systems

Design encryption specifically for a system objective

Node A

Node B

Message

Information

Action

Adversary

Distributed System

Attack

Communication in Distributed Systems

“Smart Grid”

Image from

http://www.solarshop.com.au

Example: Rate-Limited Control

Adversary

00101110010010111

Signal (sensor)

Communication

Signal (control)

Attack Signal

Example: Feedback Stabilization

“Data Rate Theorem” [Wong-Brockett 99, Baillieul 99]

Controller

Dynamic System

Encoder

Decoder

10010011011010101101010100101101011

Sensor

Adversary

Feedback
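The data rate theorem above can be illustrated with a toy simulation (my sketch, not from the talk): an unstable scalar plant x_{k+1} = a·x_k + u_k with a = 1.5 is stabilized over a one-bit feedback channel by bisecting a shared uncertainty interval, which works precisely because 1 bit > log2(1.5).

```python
# Sketch (mine): stabilizing the unstable plant x_{k+1} = a*x_k + u_k over a
# one-bit feedback channel. The data rate theorem requires rate > log2(a);
# here a = 1.5, so one bit per step is enough.

def simulate(a=1.5, x0=0.9, steps=60):
    x = x0
    center, half = 0.0, 1.0          # shared interval [center-half, center+half] containing x
    for _ in range(steps):
        bit = x >= center            # encoder sends which half of the interval holds x
        half /= 2.0
        center += half if bit else -half
        u = -a * center              # controller cancels the midpoint estimate
        x = a * x + u                # x_next = a * (x - center)
        center, half = 0.0, a * half # new uncertainty: |x_next| <= a*half, centered at 0
    return abs(x)

print(simulate())  # shrinks toward 0: the interval contracts by a/2 = 0.75 per step
```

The state contracts geometrically because each transmitted bit halves the uncertainty faster than the dynamics (factor a) can expand it.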

Traditional View of Encryption

Information inside

Shannon Analysis

1948: Channel Capacity, Lossless Source Coding, Lossy Compression

1949 - Perfect Secrecy

Adversary learns nothing about the information

Only possible if the key is larger than the information

C. Shannon, "Communication Theory of Secrecy Systems," Bell Systems Technical Journal, vol. 28, pp. 656-715, Oct. 1949.
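Shannon's perfect-secrecy condition is realized by the one-time pad. A minimal sketch (the standard construction, not from the slides), with a key as long as the message:

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR with a uniformly random, single-use key at least
    # as long as the message -- Shannon's condition for perfect secrecy.
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))
ct = otp(msg, key)

# Decryption is the same XOR. Without the key, every plaintext of this
# length is equally likely given ct, so the adversary learns nothing.
assert otp(ct, key) == msg
```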

Shannon Model

Schematic

Assumption

Enemy knows everything about the system except the key

Requirement

The decipherer accurately reconstructs the information.

C. Shannon, "Communication Theory of Secrecy Systems," Bell Systems Technical Journal, vol. 28, pp. 656-715, Oct. 1949.

Encipherer

Decipherer

Ciphertext

Key

Key

Plaintext

Plaintext

Adversary

For simple substitution:

Shannon Analysis

Equivocation vs. Redundancy

Equivocation is the conditional entropy of the key (or message) given the ciphertext.
Redundancy is the lack of entropy of the source.

Equivocation decreases with redundancy:

C. Shannon, "Communication Theory of Secrecy Systems," Bell Systems Technical Journal, vol. 28, pp. 656-715, Oct. 1949.
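Equivocation can be computed directly from a joint distribution. A small numeric illustration (the distribution is my toy example): with a uniform message and an independent ciphertext, H(M|C) = H(M), so the adversary learns nothing.

```python
from math import log2

def H(dist):
    # Shannon entropy of a distribution given as probabilities.
    return -sum(p * log2(p) for p in dist if p > 0)

# Toy joint p(message, ciphertext): uniform message, independent ciphertext.
joint = {("m0", "c0"): 0.25, ("m0", "c1"): 0.25,
         ("m1", "c0"): 0.25, ("m1", "c1"): 0.25}

pc = {}
for (m, c), p in joint.items():
    pc[c] = pc.get(c, 0) + p

# Equivocation H(M|C) = H(M,C) - H(C).
equivocation = H(joint.values()) - H(pc.values())
print(equivocation)  # 1.0 bit: the ciphertext reveals nothing about the message
```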

Computational Secrecy

Assume limited computational resources: Public Key Encryption, Trapdoor Functions

Difficulty not proven

Can become a “cat and mouse” game

Vulnerable to quantum computer attack

W. Diffie and M. Hellman, “New Directions in Cryptography,” IEEE Trans. on Info. Theory, 22(6), pp. 644-654, 1976.

1125897758834689 = 524287 × 2147483647
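The number on the slide factors into the two Mersenne primes shown (524287 = 2^19 − 1 and 2147483647 = 2^31 − 1). Naive trial division recovers them instantly at this size, which is exactly why trapdoor security rests on factoring being slow only for much larger numbers:

```python
def smallest_factor(n: int) -> int:
    # Naive trial division: instant at this size, hopeless at cryptographic sizes.
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n

n = 1125897758834689
p = smallest_factor(n)
print(p, n // p)  # 524287 2147483647  (2^19 - 1 and 2^31 - 1, both Mersenne primes)
```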

Information Theoretic Secrecy

Achieve secrecy from randomness (key or channel), not from the computational limits of the adversary.

Physical-layer secrecy: Wyner’s Wiretap Channel [Wyner 1975]

Partial secrecy: typically measured by “equivocation.”

Other approaches:
Error exponent for a guessing eavesdropper [Merhav 2003]
Cost inflicted by the adversary [this talk]

Equivocation

Not an operationally defined quantity.

Bounds: list decoding; additional information needed for decryption.

Not concerned with structure.

Our Framework

Assume secrecy resources are available (secret key, private channel, etc.). How do we encode information optimally?

Game Theoretic:

Eavesdropper is the adversary
System performance (for example, stability) is the payoff
Bayesian games
Information structure

Competitive Distributed System

Node A

Node B

Message

Key

Information

Action

Adversary

Attack

Encoder:

System payoff: .

Decoder:

Adversary:

Zero-Sum Game

Value obtained by system:

Objective

Maximize payoff

Node A

Node B

Message

Key

Information

Action

Adversary

Attack

Secrecy-Distortion Literature

[Yamamoto 97]: Cause an eavesdropper to have high reconstruction distortion.

Replace payoff (π) with distortion.
No causal information to the eavesdropper.
Warning: the problem statement can be too optimistic!

How to Force High Distortion

Randomly assign bins; the adversary only knows the bin.
Reconstruction depends only on the marginal posterior distribution of the source.

Example (Bern(1/3)):
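The binning idea can be sketched numerically (the sizes and seed are mine, purely illustrative): enumerate all length-n Bern(1/3) sequences, bin them at random, and compute the best expected Hamming distortion an adversary who sees only the bin index can achieve.

```python
import itertools, random

random.seed(1)
n, num_bins, p = 8, 16, 1/3          # illustrative sizes, not from the talk

seqs = list(itertools.product([0, 1], repeat=n))
prob = {s: p ** sum(s) * (1 - p) ** (n - sum(s)) for s in seqs}
bin_of = {s: random.randrange(num_bins) for s in seqs}   # random binning

# The adversary sees only the bin index; its best reconstruction minimizes
# expected Hamming distortion using the per-position posterior in the bin.
distortion = 0.0
for b in range(num_bins):
    members = [s for s in seqs if bin_of[s] == b]
    pb = sum(prob[s] for s in members)
    if pb == 0:
        continue
    for i in range(n):
        q = sum(prob[s] for s in members if s[i] == 1) / pb   # P(x_i = 1 | bin)
        distortion += pb * min(q, 1 - q) / n
print(round(distortion, 3))
```

With more sequences per bin, the posterior within a bin approaches the prior and the adversary is forced toward the blind-guessing distortion.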

Theoretical Results

Information Theoretic Rate Regions
Provable Secrecy

Two Categories of Results

Lossless Transmission: simplex interpretation, linear program, Hamming distortion

General Reward Function: common information, secret key

Competitive Distributed System

Node A

Node B

Message

Key

Information

Action

Adversary

Attack

Encoder:

System payoff: .

Decoder:

Adversary:

Zero-Sum Game

Value obtained by system:

Objective

Maximize payoff

Node A

Node B

Message

Key

Information

Action

Adversary

Attack

Theorem [Cuff 10]: Lossless Case

Require Y = X.
Assume a payoff function.
Related to Yamamoto’s work [97]. Difference: the adversary is more capable with more information.

Also required:

Linear Program on the Simplex

Constraint:

Minimize:

Maximize:

U will only have mass at a small subset of points (extreme points).
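The extreme-point claim is easy to check on a toy instance (the cost vector is mine): a linear objective over the probability simplex is optimized at a vertex, so the optimal U is a point mass.

```python
# Toy check (cost vector is illustrative): minimizing a linear objective
# c.u over the probability simplex puts all mass on one extreme point.
import random

c = [0.7, 0.2, 0.9, 0.4]
best_vertex = min(range(len(c)), key=lambda i: c[i])
u = [1.0 if i == best_vertex else 0.0 for i in range(len(c))]
value = sum(ci * ui for ci, ui in zip(c, u))

# Any interior mixture does no better: its value is a convex combination
# of the entries of c, hence at least min(c).
random.seed(0)
w = [random.random() for _ in c]
w = [x / sum(w) for x in w]
assert sum(ci * wi for ci, wi in zip(c, w)) >= value
print(best_vertex, value)  # 1 0.2
```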

Linear Program on the Simplex

Binary-Hamming Case

Binary Source: Hamming Distortion.

Optimal approach: reveal excess 0’s or 1’s to condition the hidden bits.

(Figure: a source sequence, e.g. 0 1 0 0 1 …, and the corresponding public message with revealed symbols shown and hidden bits masked by *.)

Binary Source (Example)

Information source is Bern(p): usually zero (p < 0.5).
Hamming payoff.
Secret key rate R0 required to guarantee eavesdropper error.

(Figure: eavesdropper error as a function of p and the key rate R0.)
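A quick sanity check (mine) of the “usually zero” remark: with no key, the eavesdropper's best constant guess of a Bern(p) bit is the likelier symbol, giving expected Hamming error min(p, 1 − p).

```python
def eavesdropper_error(p: float) -> float:
    # Best constant guess of a Bern(p) bit with no side information:
    # guess the likelier symbol (0 when p < 0.5, matching the slide).
    err_if_guess_0 = p        # wrong exactly when the bit is 1
    err_if_guess_1 = 1 - p    # wrong exactly when the bit is 0
    return min(err_if_guess_0, err_if_guess_1)

print(eavesdropper_error(1/3))  # 0.3333...
```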

General Payoff Function

No requirement for lossless transmission.

Any payoff function π(x, y, z).
Any source distribution (i.i.d.).

Adversary:

Payoff-Rate Function

Maximum achievable average payoff

Markov relationship:

Theorem:

Unlimited Public Communication

Maximum achievable average payoff

Conditional common information:

Theorem (R = ∞):

Related Communication Methods

Two Coordination Results

Coordination Capacity

References:
[C., Permuter, Cover – IT Trans. 09]
[C. – ISIT 08]
[Bennett, Shor, Smolin, Thapliyal – IT Trans. 02]
[C., Zhao – ITW 11]

Ability to coordinate sequences (“actions”) with communication limitations.

Empirical Coordination
Strong Coordination

Empirical Coordination

X1 X2 X3 X4 X5 X6 … Xn
Y1 Y2 Y3 Y4 Y5 Y6 … Yn
Z1 Z2 Z3 Z4 Z5 Z6 … Zn

Empirical Distribution

1 0 1 1 0 0 0 1
0 1 1 0 1 0 1 1
1 1 0 1 0 0 1 0

(Figure: histogram of the joint symbol triples 000 through 111.)

Average Distortion

Average values are a function of the empirical distribution.
Example: squared error distortion.

Rate distortion theory fits in the empirical coordination context.
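As a small illustration (sequences borrowed from the empirical-distribution slide; the pairing is mine), the average squared-error distortion between two sequences depends on them only through their empirical joint distribution:

```python
from collections import Counter

# Two length-8 sequences (rows from the empirical-distribution slide).
x = [1, 0, 1, 1, 0, 0, 0, 1]
y = [0, 1, 1, 0, 1, 0, 1, 1]
n = len(x)

# Empirical joint distribution of the symbol pairs (x_i, y_i).
emp = {pair: count / n for pair, count in Counter(zip(x, y)).items()}

# Average squared-error distortion depends on the sequences only through emp.
avg_sq = sum(p * (a - b) ** 2 for (a, b), p in emp.items())
assert avg_sq == sum((a - b) ** 2 for a, b in zip(x, y)) / n
print(avg_sq)  # 0.625
```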

No Rate – No Channel

No explicit communication channel

Signal

“A” serves an analog and information role.

Analog: symbol-by-symbol relationship

(Digital): uses complex structure to carry information.

Processor 1

Processor 2

Source

Actuator 1

Actuator 2

Define Empirical Coordination

Processor 1

Processor 2

Source

is achievable if:

Coordination Region

The coordination region gives us all results concerning average distortion.

Processor 1

Processor 2

Source

Result – No constraints

Processor 1

Processor 2

Source

Achievability: make a codebook of (A^n, B^n) pairs.

General Results

Variety of causality constraints (delay)

Finite Look-ahead

Processor 1

Processor 2

Source

Alice and Bob Game

Alice and Bob want to cooperatively score points by both correctly guessing a sequence of random binary numbers (one point if they both guess correctly). Alice gets the entire sequence ahead of time. Bob only sees the past binary numbers and Alice's past guesses.

What is the optimal score in the game?

Alice and Bob Game (answer)

Online Matching Pennies [Gossner, Hernandez, Neyman, 2003]

“Online Communication”

Solution

General (causal) solution

Score in the Alice and Bob Game is a first-order statistic.

Achievable empirical distributions (Processor 2 is strictly causal):

Surprise: Bob doesn’t need to see the past of the sequence.

Strong Coordination

X1 X2 X3 X4 X5 X6 … Xn
Y1 Y2 Y3 Y4 Y5 Y6 … Yn
Z1 Z2 Z3 Z4 Z5 Z6 … Zn

The joint distribution of the sequences is i.i.d. with respect to the desired joint distribution. (Allow epsilon total variation distance.)
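The epsilon total-variation requirement is straightforward to evaluate; a minimal checker (the distributions are my toy examples):

```python
def total_variation(p: dict, q: dict) -> float:
    # TV distance between two distributions given as {outcome: probability}.
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0) - q.get(k, 0)) for k in keys)

target  = {(0, 0): 0.40, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.40}
induced = {(0, 0): 0.38, (0, 1): 0.12, (1, 0): 0.11, (1, 1): 0.39}

eps = 0.05
tv = total_variation(target, induced)
assert tv <= eps   # within the allowed epsilon for strong coordination
print(tv)
```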

Point-to-point Coordination

Theorem [C. 08]: Strong Coordination involves picking a V such that X – V – Y.

Message: R > I(X;V)
Common Randomness: R0 + R > I(X,Y;V)

Uses a randomized decoder (channel from V to Y).

Node A

Node B

Message

Common Randomness

Source

Output

Synthetic Channel p(y|x)
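The two rate conditions can be evaluated numerically for a toy Markov chain X – V – Y (all distributions here are my illustrative choices, not from the talk):

```python
from math import log2
from itertools import product

# Toy Markov chain X - V - Y: p(x, v, y) = p(x) p(v|x) p(y|v).
px  = {0: 0.5, 1: 0.5}
pvx = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}   # p(v | x)
pyv = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}   # p(y | v)

joint = {(x, v, y): px[x] * pvx[x][v] * pyv[v][y]
         for x, v, y in product((0, 1), repeat=3)}

def mi(pairs):
    # Mutual information I(A; B) from a dict {(a, b): probability}.
    pa, pb = {}, {}
    for (a, b), pr in pairs.items():
        pa[a] = pa.get(a, 0) + pr
        pb[b] = pb.get(b, 0) + pr
    return sum(pr * log2(pr / (pa[a] * pb[b]))
               for (a, b), pr in pairs.items() if pr > 0)

ixv  = mi({(x, v): sum(joint[(x, v, y)] for y in (0, 1))
           for x, v in product((0, 1), repeat=2)})
ixyv = mi({((x, y), v): joint[(x, v, y)]
           for x, v, y in product((0, 1), repeat=3)})

print(f"need R > I(X;V) = {ixv:.3f} and R0 + R > I(X,Y;V) = {ixyv:.3f}")
```

Note I(X,Y;V) ≥ I(X;V) always holds, so the common-randomness condition is the binding one when R is chosen close to I(X;V).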

Zero-Sum Game

Value obtained by system:

Objective

Maximize payoff

Node A

Node B

Message

Key

Information

Action

Adversary

Attack

Encoding Scheme

Coordination Strategies:
Empirical coordination for U
Strong coordination for Y

K

Converse

What the Adversary doesn’t know can hurt him. [Yamamoto 97]

Knowledge of Adversary: [Yamamoto 88]:

Proposed View of Encryption

Information obscured

Images from albo.co.uk