Vote privacy: models and cryptographic underpinnings (PowerPoint presentation)
Uploaded by faustina-dinatale on 2016-03-04

Presentation Transcript

Slide 1

Vote privacy: models and cryptographic underpinnings

Bogdan Warinschi, University of Bristol

Slide 2

Aims and objectives

Models are useful, desirable
Cryptographic proofs are not difficult
Have y'all do one cryptographic proof
Have y'all develop a zero-knowledge protocol
Have y'all prove one property for a zero-knowledge protocol

Slide 3

Models

Slide 4

Voting scheme

Voters P1, ..., Pn cast votes v1, v2, ..., vn in V; the outcome is ρ(v1, v2, ..., vn).

Result function ρ: V* → Results. Example: V = {0,1}, ρ(v1, v2, ..., vn) = v1 + v2 + ... + vn.

Slide 5

Wish list

Eligibility: only legitimate voters vote; each voter votes once
Fairness: voting does not reveal early results
Verifiability: individual, universal
Privacy: no information about the individual votes is revealed
Receipt-freeness: a voter cannot prove s/he voted in a certain way
Coercion-resistance: a voter cannot interact with a coercer to prove that s/he voted in a certain way

Slide 6

Design-then-break paradigm

...attack found
...attack found
...attack found
...no attack found

Guarantees: no attack has been found yet

Slide 7

Security models

Mathematical descriptions of:
what a system is
how a system works
what an attacker is
what a break is

Advantages: they clarify the security notion and allow for security proofs (guarantees within clearly established boundaries)

Shortcomings: abstraction; implicit assumptions and missing details (e.g. trust in hardware, side channels)

Slide 8

This talk

Privacy-relevant cryptographic primitives: asymmetric encryption; non-interactive zero-knowledge proofs
Privacy-relevant techniques: homomorphicity; rerandomization; threshold cryptography
Security models for encryption
Security models for vote secrecy (Helios)

Slide 9

Cryptographic security models

Cryptographic system

Slide 10

Game-based models

An adversary interacts with a challenger through queries and answers; at the end, the challenger outputs 0/1.

Security: the scheme is secure if, for any adversary, the probability that the challenger outputs 1 is close to some fixed constant (typically 0, or 1/2).

Slide 11

Asymmetric encryption schemes

Slide 12

Syntax

Setup(ν): fixes parameters for the scheme
KG(params): randomized algorithm that generates (PK, SK)
ENC_PK(m): randomized algorithm that generates an encryption of m under PK
DEC_SK(C): deterministic algorithm that computes the decryption of C under SK

Slide 13

Functional properties

Correctness: for any PK, SK and M: DEC_SK(ENC_PK(M)) = M

Homomorphicity: for any PK, the function ENC_PK(·) is homomorphic:
ENC_PK(M1) * ENC_PK(M2) = ENC_PK(M1 + M2)

Slide 14

(Exponent) ElGamal

Setup(ν): produces a description of a group (G, ·) with generator g
KG(G, g): x ← {1, ..., |G|}; X ← g^x; output (X, x)
ENC_X(m): r ← {1, ..., |G|}; (R, C) ← (g^r, g^m · X^r); output (R, C)
DEC_x((R, C)): find t such that g^t = C / R^x; output t
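The scheme above can be sketched in a few lines of Python. This is a toy illustration, not from the slides: the group parameters p = 23, q = 11, g = 4 (the order-11 subgroup of Z_23^*) are assumptions chosen for readability and are far too small for any real security.

```python
import random

# Toy exponent ElGamal; parameters are illustrative only.
p, q, g = 23, 11, 4  # g generates the order-q subgroup of Z_p^*

def kg():
    """KG: x <- {1,...,|G|}; X <- g^x; output (X, x)."""
    x = random.randrange(1, q)
    return pow(g, x, p), x

def enc(X, m, r=None):
    """ENC_X(m): (R, C) <- (g^r, g^m * X^r)."""
    if r is None:
        r = random.randrange(1, q)
    return pow(g, r, p), pow(g, m, p) * pow(X, r, p) % p

def dec(x, ct):
    """DEC_x((R, C)): find t with g^t = C / R^x (brute force, m is small)."""
    R, C = ct
    gm = C * pow(pow(R, x, p), p - 2, p) % p  # C / R^x via Fermat inverse
    t = 0
    while pow(g, t, p) != gm:
        t += 1
    return t

X, x = kg()
assert dec(x, enc(X, 7)) == 7
```

The brute-force search in DEC is inherent to exponent ElGamal: decryption recovers g^m, and m must be small enough to find by exhaustive search.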

Slide 15

Functional properties

ENC_X(m): (R, C) ← (g^r, g^m · X^r); output (R, C)
DEC_x((R, C)): find t such that g^t = C / R^x; output t

Correctness: DEC outputs t such that g^t = C / R^x = g^m X^r / g^{xr} = g^m X^r / X^r = g^m.

Homomorphicity: (g^r, g^{v1} X^r) * (g^s, g^{v2} X^s) = (g^q, g^{v1+v2} X^q), where q = r + s.
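The homomorphic property can be checked directly: multiplying two ciphertexts componentwise yields an encryption of the sum. A self-contained sketch on the same assumed toy group (p = 23, q = 11, g = 4):

```python
import random

# Componentwise product of exponent-ElGamal ciphertexts encrypts the sum.
p, q, g = 23, 11, 4
x = random.randrange(1, q); X = pow(g, x, p)   # key pair

def enc(m):
    r = random.randrange(1, q)
    return pow(g, r, p), pow(g, m, p) * pow(X, r, p) % p

def dec(ct):
    R, C = ct
    gm = C * pow(pow(R, x, p), p - 2, p) % p   # C / R^x
    return next(t for t in range(q) if pow(g, t, p) == gm)

c1, c2 = enc(3), enc(4)
c_sum = (c1[0] * c2[0] % p, c1[1] * c2[1] % p)  # (g^(r+s), g^(3+4) X^(r+s))
assert dec(c_sum) == 7
```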

Slide 16

IND-CPA

par ← Setup(); (PK, SK) ← KG(par); b ← {0,1}

The challenger sends the public key PK to the adversary; the adversary chooses messages M_0, M_1; the challenger returns C ← ENC_PK(M_b); the adversary outputs a guess d; win if d = b.

The scheme is IND-CPA secure if Pr[win] ~ 1/2.

Good definition?

Theorem: If the DDH problem is hard in G then the ElGamal encryption scheme is IND-CPA secure.
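The game above also explains why encryption must be randomized. The following sketch (my own illustration, on the assumed toy group) runs the IND-CPA game against a deliberately broken deterministic variant of ElGamal: the adversary simply re-encrypts M_0 and compares, winning every time.

```python
import random

# IND-CPA game vs. a deterministic "ElGamal" with randomness fixed to r = 1.
p, q, g = 23, 11, 4

def enc_det(X, m):                       # broken: always uses r = 1
    return g % p, pow(g, m, p) * X % p

def ind_cpa_game(adversary):
    x = random.randrange(1, q); X = pow(g, x, p)
    m0, m1 = 3, 5                        # adversary-chosen messages
    b = random.randrange(2)
    challenge = enc_det(X, m1 if b else m0)
    d = adversary(X, m0, m1, challenge)
    return d == b                        # challenger outputs 1 iff d = b

def adversary(X, m0, m1, challenge):
    # With deterministic encryption, re-encrypting m0 identifies the bit.
    return 0 if challenge == enc_det(X, m0) else 1

assert all(ind_cpa_game(adversary) for _ in range(100))
```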

Slide 17

Single-pass voting scheme

Slide 18

Informal

Each voter Pi, with vote vi, computes Ci ← ENC_PK(vi) and posts it to the bulletin board BB, which collects C1, C2, ..., Cn. The tallier holds SK, uses it to obtain v1, ..., vn, then computes and returns ρ(v1, v2, ..., vn).

Slide 19

Syntax of SPS schemes

Setup(ν): generates (x, y, BB): secret information x for tallying, public information y (the parameters of the scheme), and the initial BB
Vote(y, v): the algorithm run by each voter to produce a ballot b
ProcessBallot(BB, b): run by the bulletin board; outputs a new BB and accept/reject
Tallying(BB, x): run by the tallying authorities to calculate the final result

Slide 20

An implementation: Enc2Vote

Let Π = (KG, ENC, DEC) be a homomorphic encryption scheme. Enc2Vote(Π) is:

Setup(ν): KG generates (SK, PK, [])
Vote(PK, v): b ← ENC_PK(v)
ProcessBallot([BB], b): [BB] ← [BB, b]
Tallying([BB], SK): where [BB] = [b1, b2, ..., bn], compute b = b1 * b2 * ... * bn; result ← DEC_SK(b); output result
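Enc2Vote over exponent ElGamal can be sketched as below (again on the assumed toy group; votes are 0/1 and their sum must stay below q for the brute-force decryption to recover it):

```python
import random

# Enc2Vote sketch: ballots are encrypted votes, the board multiplies them,
# and a single decryption yields the sum of the votes.
p, q, g = 23, 11, 4
x = random.randrange(1, q); X = pow(g, x, p)   # (SK, PK)

def vote(v):                                   # Vote(PK, v)
    r = random.randrange(1, q)
    return pow(g, r, p), pow(g, v, p) * pow(X, r, p) % p

def tally(bb):                                 # Tallying([BB], SK)
    R = C = 1
    for Ri, Ci in bb:
        R, C = R * Ri % p, C * Ci % p          # b1 * b2 * ... * bn
    gm = C * pow(pow(R, x, p), p - 2, p) % p   # decrypt the product
    return next(t for t in range(q) if pow(g, t, p) == gm)

bb = [vote(v) for v in (1, 0, 1, 1)]
assert tally(bb) == 3
```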

Slide 21

Attack against privacy

Voters P1 and P2 post C1 ← ENC_PK(v1) and C2 ← ENC_PK(v2). A malicious voter P3 copies C1 and posts it as his own ballot, so BB = (C1, C2, C1). The tallier uses SK to obtain v1, v2, v3 and outputs ρ(v1, v2, v3) = 2·v1 + v2.

Assume that votes are either 0 or 1. If the result is 0 or 1 then v1 was 0; otherwise v1 was 1.

FIX: weed out equal ciphertexts.

Slide 22

New attack

Voters P1 and P2 post C1 ← ENC_PK(v1) and C2 ← ENC_PK(v2). This time P3 calculates C0 = ENC_PK(0) and C = C1 · C0 = ENC_PK(v1), a fresh-looking encryption of v1, and posts C, so BB = (C1, C2, C). The tallier again outputs ρ(v1, v2, v3) = 2·v1 + v2, and weeding out equal ciphertexts does not help.

FIX: make sure ciphertexts cannot be mauled, and weed out equal ciphertexts.

Slide 23

Non-malleable encryption (NM-CPA)

params ← Setup(); (PK, SK) ← KG(params); b ← {0,1}

The challenger sends PK to the adversary; the adversary chooses M_0, M_1; the challenger returns C ← ENC_PK(M_b); the adversary submits ciphertexts C_1, C_2, ..., C_n (different from C) and receives M_i ← DEC_SK(C_i), for i = 1..n; the adversary outputs a guess d; win if d = b.

Good definition?

Slide 24

ElGamal is not non-malleable

Any homomorphic scheme is malleable: given ENC_PK(m) one can efficiently compute ENC_PK(m+1) (by multiplying with an encryption of 1).

For ElGamal:
Submit 0, 1 as the challenge messages
Obtain c = (R, C)
Submit (R, C·g) for decryption. If the response is 1, then b is 0; if the response is 2, then b is 1.
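The attack above runs exactly as described, as this sketch on the assumed toy group shows: multiplying the second ciphertext component by g shifts the plaintext by one, and a single decryption query on the mauled ciphertext reveals the hidden bit.

```python
import random

# Malleability attack on toy exponent ElGamal (NM-CPA adversary).
p, q, g = 23, 11, 4
x = random.randrange(1, q); X = pow(g, x, p)

def enc(m):
    r = random.randrange(1, q)
    return pow(g, r, p), pow(g, m, p) * pow(X, r, p) % p

def dec(ct):
    R, C = ct
    gm = C * pow(pow(R, x, p), p - 2, p) % p
    return next(t for t in range(q) if pow(g, t, p) == gm)

b = random.randrange(2)                 # challenger encrypts the bit b
R, C = enc(b)
mauled = (R, C * g % p)                 # encrypts b + 1, differs from challenge
guess = 0 if dec(mauled) == 1 else 1    # response 1 -> b = 0; response 2 -> b = 1
assert guess == b
```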

Slide 25

Ballot secrecy for SPS [BCPSW11]

The challenger keeps two boards BB_0 and BB_1 and picks a bit b; the adversary only sees BB_b.

For each pair of honest votes (h_0, h_1): C_0 ← Vote_PK(h_0) is placed on BB_0 and C_1 ← Vote_PK(h_1) on BB_1.
Each adversarial ballot C is placed on both boards.
The result is always computed from the first board: result r ← Tally_SK(BB_0).
The adversary outputs a guess d and wins if d = b.

Slide 26

Theorem: If Π is a non-malleable encryption scheme then Enc2Vote(Π) has ballot secrecy.

Proof idea (from the diagram): the ballot-secrecy adversary is turned into an NM-CPA adversary. The honest votes h_0, h_1 are submitted as the challenge messages, so C_1 ← ENC_PK(h_b) plays the role of the honest ballot on the board; the adversary's ballots C_1, C_2, ..., C_t are submitted to the decryption oracle to obtain v_1, v_2, ..., v_t, and the result is computed as r ← ρ(H_0, V).

Slide 27

Theorem: If Π is a non-malleable encryption scheme then Enc2Vote(Π) has vote secrecy.

The reduction replays the NM-CPA game: params ← Setup(); (PK, SK) ← KG(params); b ← {0,1}; C ← ENC_PK(h_b) is embedded as the honest ballot; mauled copies C' of the challenge are rejected; the adversary's remaining ciphertexts C_1, C_2, ..., C_t are answered with the decryptions v_1, v_2, ..., v_t; the result is computed as r ← ρ(H_0, V); win if d = b.

Slide 28

Zero-knowledge proofs

Slide 29

Interactive proofs

A Prover holding (X, w) exchanges messages M_1, M_2, M_3, ..., M_n with a Verifier holding X; the Verifier outputs Accept/Reject.

The Prover wants to convince the Verifier that something is true about X; formally, that Rel(X, w) holds for some w. Variant: the prover actually knows such a w.

Examples:
Rel_{g,h}((X,Y), z) iff X = g^z and Y = h^z
Rel_{g,X}((R,C), r) iff R = g^r and C = X^r
Rel_{g,X}((R,C), r) iff R = g^r and C/g = X^r
Rel_{g,X}((R,C), r) iff (R = g^r and C = X^r) or (R = g^r and C/g = X^r)

Slide 30

Properties (informal)

Completeness: an honest prover always convinces an honest verifier of the validity of the statement
Soundness: a dishonest prover can cheat only with small probability
Zero knowledge: no other information is revealed
Proof of knowledge: a witness can be extracted from a successful prover

Slide 31

Equality of discrete logs [CP92]

Fix a group G and generators g and h. Rel_{g,h}((X,Y), z) = 1 iff X = g^z and Y = h^z.

P → V: U := g^r, V := h^r (where r is a random exponent)
V → P: c (where c is a random exponent)
P → V: s := r + zc; V checks: g^s = U·X^c and h^s = V·Y^c
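One round of the protocol can be sketched as follows. This is my own illustration on an assumed toy group: g = 4 and h = 9 both generate the order-11 subgroup of Z_23^*; exponent arithmetic is mod q = 11.

```python
import random

# One run of the Chaum-Pedersen protocol on a toy group.
p, q, g, h = 23, 11, 4, 9

z = random.randrange(1, q)                 # witness
X, Y = pow(g, z, p), pow(h, z, p)          # statement: X = g^z, Y = h^z

r = random.randrange(q)                    # P -> V: commitments
U, V = pow(g, r, p), pow(h, r, p)
c = random.randrange(q)                    # V -> P: random challenge
s = (r + z * c) % q                        # P -> V: response

# V checks: g^s = U * X^c and h^s = V * Y^c
assert pow(g, s, p) == U * pow(X, c, p) % p
assert pow(h, s, p) == V * pow(Y, c, p) % p
```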

Slide 32

Completeness

If X = g^z and Y = h^z:
P → V: U := g^r, V := h^r
V → P: c
P → V: s := r + zc; V checks: g^s = U·X^c and h^s = V·Y^c

The check succeeds: g^s = g^{r+zc} = g^r · g^{zc} = U·X^c (and similarly h^s = V·Y^c).

Slide 33

(Special) Soundness

From two accepting transcripts with the same first message one can extract the witness: given ((U,V), c0, s0) and ((U,V), c1, s1) such that

g^{s0} = U·X^{c0} and h^{s0} = V·Y^{c0}
g^{s1} = U·X^{c1} and h^{s1} = V·Y^{c1}

dividing gives g^{s0-s1} = X^{c0-c1} and h^{s0-s1} = Y^{c0-c1}, so

Dlog_g X = (s0-s1)/(c0-c1) = Dlog_h Y
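The extraction step is a one-liner, sketched here on the assumed toy group (g = 4, h = 9 in the order-11 subgroup of Z_23^*; `pow(a, -1, q)` needs Python 3.8+):

```python
# Witness extraction from two accepting Chaum-Pedersen transcripts that
# share the same first message (U, V): z = (s0 - s1) / (c0 - c1) mod q.
p, q, g, h = 23, 11, 4, 9
z = 7                                          # witness to be recovered
X, Y = pow(g, z, p), pow(h, z, p)

r = 5                                          # same commitment both times
U, V = pow(g, r, p), pow(h, r, p)
c0, c1 = 3, 8                                  # two different challenges
s0, s1 = (r + z * c0) % q, (r + z * c1) % q    # honest responses

z_ext = (s0 - s1) * pow(c0 - c1, -1, q) % q    # divide in the exponent
assert z_ext == z
```

This is why the prover must never reuse a commitment: two answers under the same (U, V) hand the witness to any observer.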

Slide 34

(HV) zero-knowledge

In the real execution the prover, holding (X, w) with Rel(X, w), sends R, receives the challenge c, and answers s. There exists a simulator SIM that, given only X, produces transcripts (R, c, s) that are indistinguishable from those of the real execution.

Slide 35

Special zero-knowledge

A simulator of a special form:
pick a random c
pick a random s
R ← SIM(c, s)
and output the transcript (R, c, s).

Slide 36

Special zero-knowledge for CP

Accepting transcripts: ((U,V), c, s) such that g^s = U·X^c and h^s = V·Y^c.

Special simulator:
Select a random c
Select a random s
Set U = g^s · X^{-c} and V = h^s · Y^{-c}
Output ((U,V), c, s)
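Running the special simulator on the assumed toy group shows that a perfectly valid transcript arises without ever touching the witness z:

```python
import random

# Special simulator for Chaum-Pedersen: valid transcripts without the witness.
p, q, g, h = 23, 11, 4, 9
z = random.randrange(1, q)
X, Y = pow(g, z, p), pow(h, z, p)      # only the statement is used below

def inv(a):                            # modular inverse mod p (p prime)
    return pow(a, p - 2, p)

c = random.randrange(q)                # pick the challenge first...
s = random.randrange(q)                # ...then the response...
U = pow(g, s, p) * inv(pow(X, c, p)) % p   # ...and solve for the commitment:
V = pow(h, s, p) * inv(pow(Y, c, p)) % p   # U = g^s X^(-c), V = h^s Y^(-c)

# the simulated transcript passes the verifier's checks
assert pow(g, s, p) == U * pow(X, c, p) % p
assert pow(h, s, p) == V * pow(Y, c, p) % p
```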

Slide 37

OR-proofs [CDS95,C96]

Given a Sigma protocol for Rel1(X, w) (transcripts (R1, c1, s1)) and one for Rel2(Y, w) (transcripts (R2, c2, s2)), design a protocol for Rel3(X, Y, w), where:

Rel3(X, Y, w) iff Rel1(X, w) or Rel2(Y, w)

Slide 38

OR-proofs

The prover, holding (X, Y, w), sends commitments R1 and R2; the verifier sends a single challenge c; the prover answers with s1 and s2 under challenges c1 and c2.

Slide 39

OR-proofs

Say Rel1(X, w) holds. The prover simulates the branch it cannot prove: it picks c2 itself (simulating (R2, c2, s2)) and answers the real branch with c1 = c - c2.

Slide 40

OR-proofs

Say Rel1(X, w1) holds. The prover sends R1, R2, receives c, sets c1 = c - c2, and answers with (c1, s1) and (c2, s2).

To verify: check that c1 + c2 = c and that (R1, c1, s1) and (R2, c2, s2) are accepting transcripts for the respective relations.

Slide 41

Non-interactive proofs

The Prover, holding (X, w), sends a single proof string to the Verifier, who holds X.

Slide 42

The Fiat-Shamir/Blum transform

Start from a Sigma protocol for Rel(X, w) with transcripts (R, c, s). The prover computes the challenge itself as c = H(X, R); the proof is (R, s). To verify: compute c = H(X, R) and check (R, c, s) as before.

Theorem: If (P,V) is an honest-verifier zero-knowledge Sigma protocol, then FS/B(P,V) is a simulation-sound extractable non-interactive zero-knowledge proof system (in the random oracle model).
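Applying the transform to Chaum-Pedersen gives a non-interactive proof of equal discrete logs. A sketch on the assumed toy group, where the hash H is taken to be SHA-256 reduced mod q (an illustrative choice, with R standing for the commitment pair (U, V)):

```python
import hashlib
import random

# Fiat-Shamir'd Chaum-Pedersen proof on a toy group.
p, q, g, h = 23, 11, 4, 9

def H(*vals):
    data = b"|".join(str(v).encode() for v in vals)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def prove(z, X, Y):
    r = random.randrange(q)
    U, V = pow(g, r, p), pow(h, r, p)
    c = H(X, Y, U, V)              # prover derives the challenge itself
    s = (r + z * c) % q
    return U, V, s                 # the proof is (R, s) = ((U, V), s)

def verify(X, Y, proof):
    U, V, s = proof
    c = H(X, Y, U, V)              # recompute c, then check as before
    return (pow(g, s, p) == U * pow(X, c, p) % p and
            pow(h, s, p) == V * pow(Y, c, p) % p)

z = random.randrange(1, q)
X, Y = pow(g, z, p), pow(h, z, p)
assert verify(X, Y, prove(z, X, Y))
```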

Slide 43

ElGamal + PoK

Let v ∈ {0,1} and (R, C) = (g^r, g^v · X^r). Attach an OR-proof that the ciphertext encrypts 0 or 1; the branch u = 1 - v is simulated:

Set u = 1 - v
Pick c, s at random
Set A_u = g^s · R^{-c} and B_u = X^s · (C·g^{-u})^{-c}

Slide 44

ElGamal + PoK (continued)

Pick A_v = g^a, B_v = X^a
h ← H(A_0, B_0, A_1, B_1)
c' ← h - c
s' ← a + r·c'
Output ((R, C), A_0, B_0, A_1, B_1, s, s', c, c')

Theorem: ElGamal+PoK as defined is NM-CPA, in the random oracle model.

Theorem: Enc2Vote(ElGamal+PoK) has vote secrecy, in the random oracle model.
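The whole ballot construction can be sketched end to end. This is my own illustration on the assumed toy group, with the challenges indexed by branch (c_0 + c_1 = H(...) mod q) rather than the slides' (c, c') naming, and SHA-256 mod q standing in for the random oracle:

```python
import hashlib
import random

# ElGamal ballot with a Fiat-Shamir OR-proof that it encrypts 0 or 1.
p, q, g = 23, 11, 4

def H(*vals):
    data = b"|".join(str(v).encode() for v in vals)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def inv(a):
    return pow(a, p - 2, p)

def ballot(X, v):
    r = random.randrange(1, q)
    R, C = pow(g, r, p), pow(g, v, p) * pow(X, r, p) % p
    u = 1 - v
    cu, su = random.randrange(q), random.randrange(q)   # simulate branch u
    A = {u: pow(g, su, p) * inv(pow(R, cu, p)) % p}
    B = {u: pow(X, su, p) * inv(pow(C * inv(pow(g, u, p)) % p, cu, p)) % p}
    a = random.randrange(q)                             # real branch v
    A[v], B[v] = pow(g, a, p), pow(X, a, p)
    cv = (H(R, C, A[0], B[0], A[1], B[1]) - cu) % q     # split the challenge
    sv = (a + r * cv) % q
    return R, C, A, B, {u: cu, v: cv}, {u: su, v: sv}

def check(X, R, C, A, B, c, s):
    if (c[0] + c[1]) % q != H(R, C, A[0], B[0], A[1], B[1]):
        return False
    for i in (0, 1):                                    # both branches verify
        Ci = C * inv(pow(g, i, p)) % p                  # C / g^i
        if pow(g, s[i], p) != A[i] * pow(R, c[i], p) % p:
            return False
        if pow(X, s[i], p) != B[i] * pow(Ci, c[i], p) % p:
            return False
    return True

x = random.randrange(1, q); X = pow(g, x, p)
for v in (0, 1):
    assert check(X, *ballot(X, v))
```

Note how the verifier cannot tell which branch was simulated, yet the challenge split c_0 + c_1 = H(...) forces at least one branch to be answered honestly.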

Slide 45

Random oracle [BR93,CGH98]

An unsound heuristic: there exist schemes that are secure in the random oracle model for which any instantiation is insecure. Efficiency vs. security.

Slide 46

Exercise: distributed ElGamal decryption

Party P_i has secret key x_i and public key X_i = g^{x_i}.
The parties share the secret key x = x_1 + x_2 + ... + x_k; the corresponding public key is X = Π X_i = g^{Σ x_i} = g^x.
To decrypt (R, C): party P_i computes y_i ← R^{x_i} and proves that dlog_R(y_i) = dlog_g(X_i).
Output: C / (y_1 · y_2 · ... · y_k) = C / R^x.

Design a non-interactive zero-knowledge proof that P_i behaves correctly.
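The decryption part of the exercise (without the proofs, which are exactly Chaum-Pedersen statements) can be sketched on the assumed toy group:

```python
import random

# Distributed ElGamal decryption: three parties holding additive shares of x
# jointly decrypt without ever reconstructing x.
p, q, g = 23, 11, 4

xs = [random.randrange(1, q) for _ in range(3)]   # secret shares x_i
X = 1
for xi in xs:
    X = X * pow(g, xi, p) % p                     # X = prod g^{x_i} = g^x

m, r = 6, random.randrange(1, q)                  # encrypt m under X
R, C = pow(g, r, p), pow(g, m, p) * pow(X, r, p) % p

ys = [pow(R, xi, p) for xi in xs]                 # each P_i publishes y_i = R^{x_i}
y = 1
for yi in ys:
    y = y * yi % p                                # y1 * y2 * y3 = R^x
gm = C * pow(y, p - 2, p) % p                     # C / R^x
assert gm == pow(g, m, p)
```

Each y_i would be accompanied by a Chaum-Pedersen proof that dlog_R(y_i) = dlog_g(X_i), made non-interactive via Fiat-Shamir.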

Slide 47

Ballot secrecy vs. vote privacy

Assume ρ(v_1, v_2, ..., v_n) = (v_1, v_2, ..., v_n), or ρ(v_1, v_2, ..., v_n) = v_1 + v_2 + ... + v_n and the final result is 0 or n. The result function itself reveals information about the votes; ballot secrecy does not account for this loss of privacy.
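The leakage through the result function can be quantified. The following small computation (my own illustration, not from the slides) takes n = 3 uniform 0/1 votes and measures how well an observer who sees only the tally can guess v_1: the best strategy succeeds with probability 3/4, so about 0.415 bits of average min-entropy remain instead of the full 1 bit.

```python
import itertools
import math
from collections import defaultdict

# How much does the tally alone reveal about one vote?
n = 3
profiles = list(itertools.product([0, 1], repeat=n))
by_tally = defaultdict(list)
for prof in profiles:
    by_tally[sum(prof)].append(prof)

p_guess = 0.0
for tally, profs in by_tally.items():
    p_tally = len(profs) / len(profiles)
    # best guess of v1 given this tally: the majority value among profiles
    best = max(sum(prof[0] == b for prof in profs) / len(profs) for b in (0, 1))
    p_guess += p_tally * best

assert abs(p_guess - 0.75) < 1e-9
remaining_bits = -math.log2(p_guess)   # average min-entropy of v1 given tally
```

With the identity result function the same computation gives p_guess = 1: the result reveals everything, which is why a separate privacy measure is needed.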

Slide 48

An information-theoretic approach to vote privacy [BCPW12?]

Slide 49

Information theory

Uncertainty regarding a certain value (e.g. the honest votes): assume the votes follow a distribution X. (We use distributions and random variables interchangeably.)

Assume that F measures the difficulty that an unbounded adversary has in predicting X (e.g. if X is uniform on {m_0, m_1} then F(X) = 1).

Slide 50

Conditional privacy measure

Let X, Y be distributed according to a joint distribution. F(X | Y) measures the uncertainty regarding X given that Y is observed (by an unbounded adversary):

F(X | Y) ≤ F(X)
If X and Y are independent, then F(X | Y) = F(X)
If X is computable from Y, then F(X | Y) = 0
If Y' can be computed as a function of Y, then F(X | Y) ≤ F(X | Y')

Slide 51

Computational variant

F(M | ENC_PK(M)) = ?

Slide 52

Computational variant

F(M | ENC_PK(M)) = 0, since M is computable from ENC_PK(M) by an unbounded adversary.

How much uncertainty about X remains after a computationally bounded adversary sees Y? Look at Y' such that (X, Y) ≈ (X, Y').

Define: F_c(X | Y) ≥ r if there exists Y' such that (X, Y) ≈ (X, Y') and F(X | Y') = r.

Slide 53

Example

Proposition: If ENC is an IND-CPA encryption scheme then F_c(M | ENC_PK(M)) ≥ F(M).

Proof: Since ENC is IND-CPA secure, (M, ENC_PK(M)) ≈ (M, ENC_PK(0)), so F_c(M | ENC_PK(M)) ≥ F(M | ENC_PK(0)) = F(M).

Slide 54

Variation

Consider random variables: T (target), L (leakage), R (result).

Define: F_c(T | L, R) ≥ r if there exists L' such that (T, L, R) ≈ (T, L', R) and F(T | L', R) = r.

Slide 55

Application to voting

D: distribution of the honest votes
T: a target function on supp(D), e.g.:
T(v_1, v_2, ..., v_m) = v_i
T(v_1, v_2, ..., v_m) = (v_i = v_j)?
others

The adversary sees his view view_A(D, Π), which includes the result ρ(D, v_A).

Slide 56

Measure(s) for vote privacy

Definition: For D a distribution on honest votes, T a target function, and Π a protocol, define:

M_c(T, D, Π) = inf_A F_c(T(D) | view_A(D, Π), ρ(D, v_A))

One can also define an information-theoretic notion:

M(T, D, Π) = inf_A F(T(D) | view_A(D, Π), ρ(D, v_A))

Slide 57

Privacy of idealized protocols

Proposition: M(T, D, Π) = inf_A F(T(D) | ρ(D, v_A))

Slide 58

Recall: vote secrecy for SPS

The challenger keeps BB_0 and BB_1 and a bit b; the adversary sees BB_b. Honest votes (h_0, h_1) yield C_0 ← ENC_PK(h_0) on BB_0 and C_1 ← ENC_PK(h_1) on BB_1; adversarial ballots C go on both boards; the result is r ← Tally_SK(BB_0); the adversary wins if d = b.

Slide 59

Recall: vote secrecy for SPS

In the same game, the honest ballot on the visible board can be replaced by C_0 ← ENC_PK(0): the pair of honest inputs becomes (h_0, 0) while the tally is still computed from BB_0.

Proposition: A secure SPS implies that

(D, view_A(D, Π), ρ(D, v_A)) ~ (D, view_A(0, Π), ρ(D, v_A))

Corollary: A secure SPS for D implies that M_c(T, D, Π) = M(T, D, Π).

Proof:
M_c(T, D, Π) = inf_A F_c(T(D) | view_A(D, Π), ρ(D, v_A))
            ≥ inf_A F(T(D) | view_A(0, Π), ρ(D, v_A))
            = inf_A F(T(D) | ρ(D, v_A)) = M(T, D, Π)

Slide 60

Relation with d-privacy

Set F to be average min-entropy.

Slide 61

Choice of entropy

Average min-entropy: measures the probability that an observer guesses the target function of the votes
Min min-entropy: measures the probability that an observer guesses the target function of the votes for the worst possible election outcome
Min Hartley entropy: measures the minimum number of values that the target function can take for any assignment of votes

Slide 62

NOT COVERED

Slide 63

Threshold decryption

Party P_i has (x_i, X_i = g^{x_i}); x = x_1 + x_2 + ... + x_k; X = Π X_i = g^{Σ x_i} = g^x.

To decrypt (R, C):
P_i: y_i ← R^{x_i}; prove that dlog_R(y_i) = dlog_g(X_i)
Output: C / (y_1 · y_2 · ... · y_k) = C / R^x

Slide 64

Simulation-based models [Groth05]

Security: a protocol Π is secure (as secure as the ideal functionality) if for any real-world adversary there exists an ideal-world simulator such that the overall outputs are indistinguishable.

Slide 65

Games vs. simulation security

Games:
Not always intuitive
Difficult to design: the challenger/queries should reflect all potential uses of the system and permit access to all the information that can be gleaned

Simulation:
More intuitive (for simple systems)
Too demanding (e.g. adaptive security)

Slide 66

Relation with d-privacy

Set F to be average min-entropy.

Slide 67

Dolev-Yao models [DKR09]

Protocols are specified in a process algebra (the applied-pi calculus).
Vote secrecy: P[vote1/v1, vote2/v2] ≈ P[vote2/v1, vote1/v2]
Abstraction? Relation with the game-based definition?

Slide 68

Incoercibility / receipt-freeness

Slide 69

Mix-nets

Slide 70

Everlasting privacy

Slide 71

Commitments

Slide 72

Fully homomorphic encryption

Slide 73

Conclusions

Models (symbolic, computational) are important. Models, models, models...
Proofs (symbolic, computational) are important. Proofs, proofs?
A first step towards a privacy measure.

Slide 74

Thanks