# Secure Communication for Distributed Systems (PPT, DocSlides)

pamella-moone | 2018-03-09 | General

### Presentations text content in Secure Communication for Distributed Systems


Paul Cuff, Electrical Engineering, Princeton University. Overview: an application (a framework for secrecy of distributed systems) and a theoretical result (information theory in a competitive context, a zero-sum game). ID: 644602

Link: https://www.docslides.com/pamella-moone/secure-communication-for-distributed-systems

Slide 1: Secure Communication for Distributed Systems

Paul Cuff
Electrical Engineering
Princeton University

Slide 2: Overview

- Application: a framework for secrecy of distributed systems
- Theoretical result: information theory in a competitive context (zero-sum game)
- Two methods of coordination

Slide 3: Main Idea

Secrecy for distributed systems: design encryption specifically for a system objective.

(Diagram: Information enters Node A, which sends a Message to Node B; Node B takes an Action; an Adversary attacks the Distributed System.)

Slide 4: Communication in Distributed Systems

“Smart Grid” (image from http://www.solarshop.com.au)

Slide 5: Example: Rate-Limited Control

(Diagram: a sensor signal is encoded as a bit stream, e.g. 00101110010010111, and communicated to produce a control signal; an Adversary injects an attack signal.)

Slide 6: Example: Feedback Stabilization

“Data Rate Theorem” [Wong-Brockett 99, Baillieul 99]

(Diagram: Sensor → Encoder → bit stream → Decoder → Controller → Dynamic System, with feedback; an Adversary observes the link.)

Slide 7: Traditional View of Encryption

Information inside

Slide 8: Shannon Analysis

1948: Channel Capacity, Lossless Source Coding, Lossy Compression
1949: Perfect Secrecy

- The adversary learns nothing about the information
- Only possible if the key is at least as large as the information

C. Shannon, "Communication Theory of Secrecy Systems," Bell System Technical Journal, vol. 28, pp. 656-715, Oct. 1949.
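Perfect secrecy in Shannon's sense is exactly what the one-time pad achieves. A minimal sketch (the message string is my own example): the key is uniformly random, used once, and as long as the message, so the ciphertext is statistically independent of the plaintext.

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each byte with a fresh uniformly random key byte.
    # Perfect secrecy requires len(key) >= len(data) and a single use.
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"launch at dawn"
key = secrets.token_bytes(len(message))  # key as large as the information
ciphertext = otp(message, key)
recovered = otp(ciphertext, key)  # XOR is its own inverse
assert recovered == message
```

Any shorter key necessarily leaks information, which is Shannon's 1949 impossibility result.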

Slide 9: Shannon Model

Schematic: Plaintext → Encipherer → Ciphertext → Decipherer → Plaintext, with the Key shared by both ends; the Adversary observes the ciphertext.

Assumption: the enemy knows everything about the system except the key.

Requirement: the decipherer accurately reconstructs the information.

For simple substitution:

Slide 10: Shannon Analysis

Equivocation vs. Redundancy

- Equivocation is conditional entropy, e.g. the key equivocation H(K | ciphertext)
- Redundancy is the lack of entropy of the source
- Equivocation decreases as redundancy increases
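The interplay between redundancy and equivocation can be computed directly for a toy cipher. A hedged sketch (the distributions are my own choice, not from the slides): a one-bit message with a biased, i.e. redundant, source, a uniform one-bit key, and ciphertext C = M XOR K. The key equivocation H(K | C) drops below H(K) = 1 bit precisely because the source is redundant.

```python
from math import log2

# Toy shift cipher over a binary alphabet (illustrative assumption):
# M ~ {0: 0.7, 1: 0.3} is a redundant source, K is a uniform one-bit key.
p_m = {0: 0.7, 1: 0.3}
p_k = {0: 0.5, 1: 0.5}

# Joint distribution of (K, C) where C = M XOR K.
joint = {}
for m, pm in p_m.items():
    for k, pk in p_k.items():
        c = m ^ k
        joint[(k, c)] = joint.get((k, c), 0.0) + pm * pk

# Marginal of the ciphertext.
p_c = {}
for (k, c), p in joint.items():
    p_c[c] = p_c.get(c, 0.0) + p

# Key equivocation H(K | C) in bits.
h_k_given_c = -sum(p * log2(p / p_c[c]) for (k, c), p in joint.items())
h_k = -sum(p * log2(p) for p in p_k.values())  # = 1 bit
print(h_k_given_c)  # ≈ 0.881 < 1: source redundancy leaks key information
```

With a uniform (zero-redundancy) source the same computation would give H(K | C) = H(K) = 1, i.e. perfect secrecy.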

Slide 11: Computational Secrecy

- Assume limited computational resources
- Public-key encryption, trapdoor functions
- Difficulty not proven; can become a “cat and mouse” game
- Vulnerable to quantum computer attack

524287 × 2147483647 = 1125897758834689

W. Diffie and M. Hellman, “New Directions in Cryptography,” IEEE Trans. on Info. Theory, 22(6), pp. 644-654, 1976.
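The slide's product illustrates the trapdoor asymmetry: multiplying the two primes is instant, while recovering them without the trapdoor takes work. A small sketch (the naive trial-division routine is my own illustration, not a real attack):

```python
# The two Mersenne primes shown on the slide: 2**19 - 1 and 2**31 - 1.
p, q = 524287, 2147483647
n = p * q  # multiplying is easy
assert n == 1125897758834689

def smallest_factor(n: int) -> int:
    # Naive trial division: feasible for this toy modulus, hopeless for the
    # 2048-bit moduli used in practice. (n is odd, so only odd candidates.)
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d
        d += 2
    return n

assert smallest_factor(n) == p  # factoring is the hard direction
```

Scaling the factors up makes the second direction computationally infeasible, which is the (unproven) hardness assumption behind public-key schemes.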

Slide 12: Information Theoretic Secrecy

- Achieve secrecy from randomness (key or channel), not from the computational limits of the adversary
- Physical-layer secrecy: Wyner's wiretap channel [Wyner 1975]
- Partial secrecy, typically measured by “equivocation”
- Other approaches: error exponent for guessing eavesdropper [Merhav 2003]; cost inflicted by adversary [this talk]

Slide 13: Equivocation

Not an operationally defined quantity. Bounds:
- List decoding
- Additional information needed for decryption

Not concerned with structure.

Slide 14: Our Framework

Assume secrecy resources are available (secret key, private channel, etc.). How do we encode information optimally?

Game theoretic:
- The eavesdropper is the adversary
- System performance (for example, stability) is the payoff
- Bayesian games
- Information structure

Slide 15: Competitive Distributed System

(Diagram: Information enters Node A, which uses a Key to send a Message to Node B; Node B takes an Action; the Adversary observes the message and chooses an Attack.)

Encoder:
Decoder:
Adversary:
System payoff:

Slide 16: Zero-Sum Game

Value obtained by system:

Objective: maximize payoff.

(Diagram: as in Slide 15, with Node A, Node B, Message, Key, Information, Action, Adversary, Attack.)
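The value of a zero-sum game is the payoff each side can guarantee with an optimal mixed strategy. A hedged sketch (the matrix game is my own standard example, not the game of the talk): matching pennies, solved by maximizing the row player's security level over mixing probabilities.

```python
# Matching pennies: the row player wants to match, the column player to
# mismatch. Entries are the row player's payoff.
A = [[1, -1],
     [-1, 1]]

def row_security_level(p: float) -> float:
    # Expected payoff of mixing rows (p, 1-p) against the worst column.
    return min(p * A[0][j] + (1 - p) * A[1][j] for j in range(2))

# Maximize the security level over a grid of mixing probabilities.
grid = [i / 1000 for i in range(1001)]
best_p = max(grid, key=row_security_level)
value = row_security_level(best_p)
print(best_p, value)  # optimal mix 0.5/0.5, game value 0
```

The same max-min structure is what the encoder faces against the eavesdropper here, only with strategies that are codes rather than coin flips.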

Slide 17: Secrecy-Distortion Literature

[Yamamoto 97]: cause an eavesdropper to have high reconstruction distortion

- Replace payoff (π) with distortion
- No causal information to the eavesdropper
- Warning: the problem statement can be too optimistic!

Slide 18: How to Force High Distortion

- Randomly assign bins; the adversary only knows the bin
- The adversary's reconstruction depends only on the marginal posterior distribution of the source given the bin

Example (Bern(1/3)):
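A hedged simulation of the Bern(1/3) example (the setup details are my own assumptions): when the bin index carries essentially no information about individual symbols, the adversary's posterior collapses to the prior, so the best per-symbol guess is the more likely symbol 0, incurring Hamming distortion close to 1/3.

```python
import random

random.seed(0)
# A Bern(1/3) source; a random bin index would tell the adversary almost
# nothing per symbol, so we model the adversary as guessing from the prior.
n = 100_000
p = 1 / 3
x = [1 if random.random() < p else 0 for _ in range(n)]

adversary_guess = [0] * n  # symbolwise MAP guess under the prior
distortion = sum(xi != gi for xi, gi in zip(x, adversary_guess)) / n
print(distortion)  # ≈ 1/3 = the source's probability of a 1
```

This is the best the adversary can do without information about which sequence in the bin occurred, which is exactly what random binning denies him.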

Slide 19: Theoretical Results

- Information-theoretic rate regions
- Provable secrecy

Slide 20: Two Categories of Results

- Lossless transmission: simplex interpretation, linear program, Hamming distortion
- General reward function: common information, secret key

Slide 21: Competitive Distributed System

(Same setup as Slide 15: Node A encodes the Information with a Key into a Message; Node B decodes and takes an Action; the Adversary attacks. Encoder, decoder, adversary, and system payoff as before.)

Slide 22: Zero-Sum Game

(Same as Slide 16: the value obtained by the system is the payoff it can guarantee; the objective is to maximize payoff.)

Slide 23: Theorem (Lossless Case) [Cuff 10]

- Require Y = X
- Assume a payoff function
- Related to Yamamoto's work [97]; difference: the adversary is more capable with more information

Also required:

Slide 24: Linear Program on the Simplex

Constraint:

Minimize:

Maximize:

U will only have mass at a small subset of points (extreme points)
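The extreme-point claim can be seen in the simplest case. A hedged sketch (the objective vector is my own example): a linear objective over the probability simplex attains its optimum at a vertex, which is why the optimal U concentrates its mass on a few points.

```python
# Minimize sum(c_i * u_i) over the probability simplex
# (u_i >= 0, sum u_i = 1): the optimum puts all mass on the smallest c_i.
c = [0.3, 1.2, 0.7]  # arbitrary linear objective coefficients

best_vertex = min(range(len(c)), key=lambda i: c[i])
u_opt = [1.0 if i == best_vertex else 0.0 for i in range(len(c))]
value = sum(ci * ui for ci, ui in zip(c, u_opt))
print(u_opt, value)  # [1.0, 0.0, 0.0] 0.3
```

With additional linear constraints, as on the slide, the optimum moves to a vertex of the constrained polytope, which is still a small subset of points.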

Slide 25: Linear Program on the Simplex (continued)

Slide 26: Binary-Hamming Case

- Binary source, Hamming distortion
- Optimal approach: reveal excess 0's or 1's to condition the hidden bits

(Figure: a source sequence such as 0 1 0 0 1, and public messages with the hidden positions masked, e.g. 00001*, *00*, *0*0.)
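A hedged sketch of the reveal-the-excess idea (the concrete mechanism is my own reading of the slide): for a Bern(p) source with p < 1/2, publicly reveal just enough of the surplus 0's that the remaining hidden positions are perfectly balanced. Conditioned on the public message, the hidden bits look like fair coin flips, so the eavesdropper's Hamming distortion on them is 1/2.

```python
import random

random.seed(1)
n = 10_000
p = 0.3
x = [1 if random.random() < p else 0 for _ in range(n)]

ones = [i for i, b in enumerate(x) if b == 1]
zeros = [i for i, b in enumerate(x) if b == 0]

# Keep equally many 0-positions and 1-positions hidden; reveal the rest
# (the "excess 0's") in the public message.
k = min(len(ones), len(zeros))
hidden = set(random.sample(zeros, k)) | set(random.sample(ones, k))
revealed = {i: x[i] for i in range(n) if i not in hidden}

# Among hidden positions the bits are exactly balanced, so any guess by the
# eavesdropper errs on about half of them.
balance = sum(x[i] for i in hidden) / len(hidden)
print(balance)  # = 0.5 by construction
```

In the actual coding scheme the "revealed" symbols are conveyed implicitly by the compressed message rather than listed position by position; this sketch only illustrates the conditioning effect.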

Slide 27: Binary Source (Example)

- Information source is Bern(p)
- The eavesdropper's best guess is usually zero (p < 0.5)
- Hamming payoff
- Secret key rate R0 required to guarantee eavesdropper error

(Plot: eavesdropper error versus p, for the required key rate R0.)

Slide 28: General Payoff Function

- No requirement for lossless transmission
- Any payoff function π(x, y, z)
- Any source distribution (i.i.d.)

Adversary:

Slide 29: Payoff-Rate Function

Maximum achievable average payoff

Markov relationship:

Theorem:

Slide 30: Unlimited Public Communication

Maximum achievable average payoff

Conditional common information:

Theorem (R=∞):

Slide 31: Related Communication Methods

Two Coordination Results

Slide 32: Coordination Capacity

Ability to coordinate sequences (“actions”) under communication limitations:
- Empirical coordination
- Strong coordination

References: [C., Permuter, Cover – IT Trans. 09], [C. – ISIT 08], [Bennett, Shor, Smolin, Thapliyal – IT Trans. 02], [C., Zhao – ITW 11]

Slide 33: Empirical Coordination

X1 X2 X3 X4 X5 X6 … Xn
Y1 Y2 Y3 Y4 Y5 Y6 … Yn
Z1 Z2 Z3 Z4 Z5 Z6 … Zn

Empirical Distribution

Slide 34: Empirical Distribution

Example sequences:
1 0 1 1 0 0 0 1
0 1 1 0 1 0 1 1
1 1 0 1 0 0 1 0

(Histogram over the eight triples 000, 001, 010, 011, 100, 101, 110, 111.)

Slide 35: Average Distortion

- Average values are a function of the empirical distribution
- Example: squared-error distortion
- Rate-distortion theory fits in the empirical coordination context
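The point that averages depend on the sequences only through their empirical distribution can be checked directly on the Slide 34 sequences:

```python
from collections import Counter

# The three example sequences from Slide 34.
xs = [1, 0, 1, 1, 0, 0, 0, 1]
ys = [0, 1, 1, 0, 1, 0, 1, 1]
zs = [1, 1, 0, 1, 0, 0, 1, 0]

n = len(xs)
# Empirical distribution: the fraction of positions carrying each (x, y, z).
empirical = {trip: cnt / n for trip, cnt in Counter(zip(xs, ys, zs)).items()}

# Average squared-error distortion between X and Y, computed two ways.
direct = sum((x - y) ** 2 for x, y in zip(xs, ys)) / n
via_empirical = sum(p * (x - y) ** 2 for (x, y, z), p in empirical.items())
assert direct == via_empirical
print(direct)  # 0.625
```

Any per-letter average (distortion, payoff, cost) factors through the empirical distribution the same way, which is why coordinating empirical distributions suffices for average-distortion results.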

Slide 36: No Rate, No Channel

- No explicit communication channel
- The signal “A” serves both an analog and an information role:
  - Analog: symbol-by-symbol relationship
  - (Digital): uses complex structure to carry information

(Diagram: Source → Processor 1 → Processor 2, with Actuator 1 and Actuator 2.)

Slide 37: Define Empirical Coordination

(Diagram: Source → Processor 1 → Processor 2.)

is achievable if:

Slide 38: Coordination Region

The coordination region gives us all results concerning average distortion.

(Diagram: Source → Processor 1 → Processor 2.)

Slide 39: Result – No constraints

Achievability: make a codebook of (A^n, B^n) pairs.

(Diagram: Source → Processor 1 → Processor 2.)

Slide 40: General Results

- Variety of causality constraints (delay)
- Finite look-ahead

(Diagram: Source → Processor 1 → Processor 2.)

Slide 41: Alice and Bob Game

Alice and Bob want to cooperatively score points by both correctly guessing a sequence of random binary numbers (one point if they both guess correctly). Alice gets the entire sequence ahead of time; Bob only sees the past binary numbers and Alice's past guesses.

What is the optimal score in the game?
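Before the answer, a hedged baseline (my own construction, not the optimal strategy of the talk) shows that Alice's guesses can double as a communication channel to Bob. Alice pairs up rounds: in the first round of each pair she "wastes" her guess to signal the next bit; Bob copies that signal in the second round, where both are then surely correct.

```python
import random

random.seed(2)
n_pairs = 20_000
score = 0
for _ in range(n_pairs):
    x1, x2 = random.getrandbits(1), random.getrandbits(1)
    # Round 1: Alice guesses x2 (a signal, right about x1 only by luck);
    # Bob has no information yet and guesses at random.
    alice1, bob1 = x2, random.getrandbits(1)
    score += (alice1 == x1) and (bob1 == x1)
    # Round 2: Bob copies Alice's previous guess, which equals x2;
    # Alice knows the sequence and also guesses x2.
    alice2, bob2 = x2, alice1
    score += (alice2 == x2) and (bob2 == x2)

avg = score / (2 * n_pairs)
print(avg)  # ≈ 0.625: (1/4 + 1)/2, better than 0.5 from independent guessing
```

The optimal strategy of Gossner, Hernandez, and Neyman does strictly better by mixing signaling and scoring within every round rather than alternating.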

Slide 42: Alice and Bob Game (Answer)

“Online Matching Pennies” [Gossner, Hernandez, Neyman, 2003]: “online communication”

Solution:

Slide 43: General (Causal) Solution

- The score in the Alice and Bob game is a first-order statistic
- Achievable empirical distributions (Processor 2 is strictly causal)
- Surprise: Bob doesn't need to see the past of the sequence

Slide 44: Strong Coordination

X1 X2 X3 X4 X5 X6 … Xn
Y1 Y2 Y3 Y4 Y5 Y6 … Yn
Z1 Z2 Z3 Z4 Z5 Z6 … Zn

The joint distribution of the sequences is i.i.d. according to the desired joint distribution (allowing epsilon total variation distance).

Slide 45: Point-to-Point Coordination

Theorem [C. 08]: strong coordination involves picking a V such that X - V - Y forms a Markov chain:
- Message rate: R > I(X; V)
- Common randomness: R0 + R > I(X, Y; V)
- Uses a randomized decoder (a channel from V to Y)

(Diagram: Source → Node A → Message → Node B → Output, with shared Common Randomness; together they synthesize the channel p(y|x).)
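The rate condition R > I(X; V) is a concrete number once a V is chosen. A hedged numerical example (the distribution is my own choice): take X uniform and let V be X passed through a binary symmetric channel with crossover 0.1.

```python
from math import log2

def h2(p: float) -> float:
    # Binary entropy in bits.
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

p_x1 = 0.5   # X ~ Bern(0.5)
eps = 0.1    # V = X through a BSC(0.1)
p_v1 = p_x1 * (1 - eps) + (1 - p_x1) * eps

# I(X; V) = H(V) - H(V | X) = H(V) - h2(eps)
rate = h2(p_v1) - h2(eps)
print(rate)  # ≈ 0.531 bits per symbol
```

So any message rate above about 0.531 bits per source symbol suffices for this V; the common-randomness bound I(X, Y; V) is computed the same way from the full joint distribution.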

Slide 46: Zero-Sum Game

(Same as Slide 16: maximize the payoff the system can guarantee against the adversary's attack.)

Slide 47: Encoding Scheme

Coordination strategies:
- Empirical coordination for U
- Strong coordination for Y (using the key K)

Slide 48: Converse

Slide 49: What the Adversary Doesn't Know Can Hurt Him

[Yamamoto 97]

Knowledge of adversary:

[Yamamoto 88]:

Slide 50: Proposed View of Encryption

Information obscured

(Images from albo.co.uk)
