Presentation Transcript


Pseudo-randomness

Shachar Lovett (IAS)

Coding, Complexity and Sparsity workshop

August 2011

Overview

Pseudo-randomness – what? why?

Concrete examples

Local independence

Expander graphs

Small space computations

What is pseudo-randomness?

Explicit (mathematical) objects that “look like” random objects

random objects?

“look like”?

explicit?

What is pseudo-randomness?

Explicit (mathematical) objects that “look like” random objects

Random object: random function, graph, …

Look like: some functions (distinguishers / tests) cannot distinguish a random object from a pseudorandom one

Explicit: can be generated efficiently

What is pseudo-randomness?

Example: pseudo-random bits

Random object: uniform bits U_n ∈ {0,1}^n

Pseudo-random bits: a random variable X ∈ {0,1}^n

Explicit: pseudorandom generator X = G(U_r), where G: {0,1}^r → {0,1}^n is “easy to compute”

Seed length r << n

Tests: a family of functions

What is pseudo-randomness good for?

Derandomization of randomized algorithms

Replace the random bits given to the algorithm by “pseudo-random bits”

“Look like”: the algorithm cannot distinguish random bits from pseudo-random bits

Explicit: can generate the pseudo-random bits efficiently

Derandomization: enumerate all possibilities for the pseudo-random bits (i.e. all seeds)
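A toy Python sketch of the enumeration step (the names randomized_alg and G are hypothetical placeholders, not from the talk): run the algorithm on G(seed) for every seed and take the majority vote.

```python
def derandomize(randomized_alg, G, r, input_x):
    """Run a randomized algorithm deterministically by enumerating all 2^r
    seeds of a pseudorandom generator G: {0,1}^r -> {0,1}^n and taking a
    majority vote over the answers."""
    votes, total = 0, 2 ** r
    for seed in range(total):
        seed_bits = [(seed >> i) & 1 for i in range(r)]
        votes += bool(randomized_alg(input_x, G(seed_bits)))
    # If the algorithm errs with probability < 1/2 on pseudo-random bits
    # (it cannot distinguish them from uniform), the majority is correct.
    return 2 * votes > total
```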

What is pseudo-randomness good for?

Succinct randomized data structures

Random data structures rely on randomness

Problem: randomness cannot be compressed

Solution: use pseudo-randomness instead

“Look like”: the data structure still works

Pseudo-randomness can be compressed

What is pseudo-randomness good for?

Tool in extremal combinatorics: reduce proving theorems to “random-looking” objects and “structured” objects

Example: Szemerédi’s theorem – for every k and ε > 0, for n large enough, any subset of {1,…,n} of size εn contains an arithmetic progression of length k

Proof [Gowers]:
1. Define a good notion of pseudo-randomness for sets
2. Prove the theorem for pseudo-random sets (easy for random sets)
3. Non pseudo-random sets are structured: use the structure

What is pseudo-randomness good for?

Computational complexity: a benchmark of how well we understand a computational model

Computational model: family of functions

Following are usually in increasing hardness:

Find functions outside the computational model (i.e. cannot be computed exactly)

Find functions which cannot be approximated by the computational model

Construct pseudo-random generators for the model

Local independence: pair-wise and k-wise independence

Pair-wise independence

Family of functions H = { h : U → [m] }

If h ∈ H is chosen uniformly, then for any x ≠ y ∈ U, h(x), h(y) are uniform and independent, i.e.
Pr[h(x)=a, h(y)=b] = 1/m^2 for all a, b ∈ [m]

Pair-wise independent bits: U = {1,…,n}, m = 2

Pair-wise independence: construction

Pair-wise independent bits: n = 2^m; identify [n] = {0,1}^m

Construction (all operations modulo 2): pick a ∈ {0,1}^m and b ∈ {0,1} uniformly, and set h_{a,b}(x) = <a,x> + b

Proof: fix x ≠ y; choose h (i.e. a, b) uniformly
If z ≠ 0^m: the random variable <a,z> ∈ {0,1} is uniform
Setting z = x + y: <a,x> + <a,y> is uniform
So (<a,x> + b, <a,y> + b) ∈ {0,1}^2 is uniform
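A short Python sketch of this construction with a quick empirical sanity check (my illustration, not from the slides): the n = 2^m bits indexed by x ∈ {0,1}^m are <a,x> + b mod 2, using only m + 1 truly random bits.

```python
import itertools
import random

def pairwise_independent_bits(m):
    """Sample n = 2^m bits that are pairwise independent, using only
    m + 1 truly random bits (the seed a in {0,1}^m and b in {0,1})."""
    a = [random.randrange(2) for _ in range(m)]
    b = random.randrange(2)
    bits = {}
    for x in itertools.product([0, 1], repeat=m):
        # bit indexed by x is <a, x> + b (mod 2)
        bits[x] = (sum(ai * xi for ai, xi in zip(a, x)) + b) % 2
    return bits

# Sanity check: a fixed pair of distinct indices should be (close to)
# uniform over {0,1}^2 across many samples of the seed.
counts = {}
for _ in range(10000):
    bits = pairwise_independent_bits(3)
    pair = (bits[(0, 0, 1)], bits[(1, 0, 1)])
    counts[pair] = counts.get(pair, 0) + 1
print(counts)  # each of the 4 outcomes should appear ~2500 times
```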

Application: succinct dictionaries

[Fredman-Komlós-Szemerédi’84]

Problem: a dictionary with n entries, O(1) retrieval time

Simple solution: hash to O(n^2) locations

FKS: two-step hashing, using O(n) locations

Randomized procedures, so we need to store the randomness

Pair-wise independence is crucial:
It suffices to bound collisions (so the algorithms still work)
It can be represented succinctly (i.e. “compressed”)
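A compact Python sketch of the two-level FKS idea, using the pairwise independent family x ↦ ((ax + b) mod p) mod m; the parameter choices and retry loops are simplified for illustration and are not the exact FKS scheme.

```python
import random

def make_hash(p, m):
    """Pairwise independent hash family: x -> ((a*x + b) mod p) mod m."""
    a, b = random.randrange(1, p), random.randrange(p)
    return lambda x: ((a * x + b) % p) % m

def fks_build(keys, p=2_000_003):
    """Two-level static dictionary for distinct integer keys < p:
    hash into n buckets, then hash each bucket of size s into s^2 slots
    with no collisions.  Total space stays O(n) with good probability."""
    n = len(keys)
    while True:                          # level 1: retry until space is O(n)
        h1 = make_hash(p, n)
        buckets = [[] for _ in range(n)]
        for k in keys:
            buckets[h1(k)].append(k)
        if sum(len(b) ** 2 for b in buckets) <= 4 * n:
            break
    tables = []
    for bucket in buckets:               # level 2: collision-free tables
        size = max(1, len(bucket) ** 2)
        while True:
            h2 = make_hash(p, size)
            slots = [None] * size
            ok = True
            for k in bucket:
                if slots[h2(k)] is not None:
                    ok = False           # collision: resample h2
                    break
                slots[h2(k)] = k
            if ok:
                break
        tables.append((h2, slots))
    return h1, tables

def fks_lookup(structure, key):
    h1, tables = structure
    h2, slots = tables[h1(key)]
    return slots[h2(key)] == key         # O(1): two hash evaluations

d = fks_build([3, 17, 42, 1000, 123456])
print(fks_lookup(d, 42), fks_lookup(d, 7))   # True False
```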

k-wise independence

k-wise independent bits: X ∈ {0,1}^n such that any k coordinates are jointly uniform over {0,1}^k

Powerful derandomization tool, usually for k poly-logarithmic in n

Many randomized data structures

Some general computational models (AC^0, PTFs, …)
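One classical k-wise independent construction (a standard family, not necessarily the one intended on the slide): evaluate a uniformly random polynomial of degree < k over a prime field; any k outputs at distinct points are jointly uniform.

```python
import random

def kwise_hash(k, p=101):
    """Sample h from a k-wise independent family over Z_p:
    h(x) = c_0 + c_1*x + ... + c_{k-1}*x^{k-1}  (mod p).
    For any k distinct inputs, the outputs are i.i.d. uniform over Z_p."""
    coeffs = [random.randrange(p) for _ in range(k)]
    def h(x):
        value = 0
        for c in reversed(coeffs):   # Horner evaluation mod p
            value = (value * x + c) % p
        return value
    return h

h = kwise_hash(k=4)
print([h(x) for x in range(10)])     # seed size: k field elements, not 10
```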

k-wise independence

AC^0: constant-depth circuits with AND/OR gates

Cannot distinguish uniform random bits from polylog(n)-wise independence [Braverman’09]

[Figure: a constant-depth circuit of AND/OR gates over inputs x_1, x_2, x_3, …, x_n]

k-wise independence

PTF (polynomial threshold function): the sign of a real polynomial

Basic building block in machine learning: neural networks (degree 1) and support vector machines (higher degrees)

Cannot distinguish uniform random bits from polylog(n)-wise independence [Kane’11]

Future research

Which models are fooled by k-wise independence?

More complex local independence

Designs: sequences of n bits of Hamming weight m, where every k indices “look uniform”

k-wise independent permutations

Further reading

Pairwise independence and derandomization, by Luby and Wigderson

Expander graphs

Expander graphs

Expander graphs: graphs that “look random”

Usually interested in sparse graphs

Several notions of “randomness”:
Combinatorial: edge/vertex expansion
Algebraic: eigenvalues
Geometric: isoperimetric inequalities


Applications of expander graphs

Expander graphs, as explicit instances of random-looking graphs, have numerous applications:

Reliable networks

Error reduction of randomized algorithms

Codes with linear time decoding

Metric embeddings

Derandomization of small space algorithms

…

Edge expansion

G=(V,E) d-regular graph, |V|=n

Edge boundary of S ⊆ V: ∂S = { (u,v) ∈ E : u ∈ S, v ∉ S }

Edge expansion: h(G) = min over S with |S| ≤ n/2 of |∂S| / |S|

G is an α-edge expander if h(G) ≥ α
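A brute-force check of these definitions on tiny graphs (illustrative only; exponential in |V|, and it assumes the h(G) = min |∂S|/|S| normalization used above):

```python
from itertools import combinations

def edge_expansion(vertices, edges):
    """h(G) = min over nonempty S with |S| <= n/2 of |boundary(S)| / |S|."""
    n = len(vertices)
    best = float("inf")
    for size in range(1, n // 2 + 1):
        for subset in combinations(vertices, size):
            S = set(subset)
            boundary = sum(1 for (u, v) in edges if (u in S) != (v in S))
            best = min(best, boundary / len(S))
    return best

# K_4, the complete graph on 4 vertices, is 3-regular.
K4_vertices = [0, 1, 2, 3]
K4_edges = [(u, v) for u in K4_vertices for v in K4_vertices if u < v]
print(edge_expansion(K4_vertices, K4_edges))  # 2.0: every cut is large
```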

Edge expansion: high connectivity

G=(V,E) d-regular graph, |V|=n

G is an α-edge expander if h(G) ≥ α

Thm: G is highly connected – if we remove at most εn edges (for ε small relative to α), the largest connected component has ≥ (1 − ε/α)n vertices

Proof idea: let T be the union of all components other than the largest; if |T| ≤ n/2, then all of its ≥ α|T| boundary edges were removed, so |T| ≤ εn/α

Spectral expansion

G=(V,E) d-regular, |V|=n

A = adjacency matrix of G

Eigenvalues of A: d = λ_1 ≥ λ_2 ≥ … ≥ λ_n

Second eigenvalue: λ(G) = max(|λ_2|, |λ_n|)

G is a spectral expander if λ(G) < d

d − λ(G): the eigenvalue gap
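A quick numerical check of λ(G) with numpy (the complete graph K_4 is just a convenient 3-regular example):

```python
import numpy as np

def spectral_lambda(adjacency):
    """lambda(G) = max(|lambda_2|, |lambda_n|) for a d-regular graph."""
    eigenvalues = sorted(np.linalg.eigvalsh(adjacency), reverse=True)
    return max(abs(eigenvalues[1]), abs(eigenvalues[-1]))

# Complete graph K_4 (3-regular): eigenvalues are 3, -1, -1, -1.
A = np.ones((4, 4)) - np.eye(4)
print(spectral_lambda(A))            # 1.0, so the eigenvalue gap d - lambda(G) = 2
```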

Spectral expansion: rapid mixing

G=(V,E) d-regular, |V|=n

G is a spectral expander if λ(G) < d

Thm: a random walk on G mixes fast – from any initial vertex, a random walk of O(log n / log(d/λ(G))) steps on the edges of G ends at a nearly uniform vertex

In particular, the diameter of G is O(log n / log(d/λ(G)))

Spectral expansion: rapid mixing

Thm: a random walk on G mixes in O(log n / log(d/λ(G))) steps

Random walk matrix of G: M = A/d

Distribution on V: a vector p with p_i ≥ 0, Σ_i p_i = 1

Random walk: one step maps p ↦ Mp

Second eigenvalue: ||Mp − u|| ≤ (λ(G)/d)·||p − u||, where u is the uniform distribution

Probability of the random walk ending in vertex i: (M^t p)_i = 1/n ± (λ(G)/d)^t
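A small numpy simulation of this contraction; the 4-regular circulant graph below is an arbitrary illustrative choice, not from the talk:

```python
import numpy as np

# 4-regular circulant graph on Z_12: x is adjacent to x-2, x-1, x+1, x+2.
n, d = 12, 4
A = np.zeros((n, n))
for x in range(n):
    for shift in (-2, -1, 1, 2):
        A[x, (x + shift) % n] = 1

M = A / d                              # random walk matrix
p = np.zeros(n)
p[0] = 1.0                             # walk starts at vertex 0
uniform = np.full(n, 1.0 / n)
for t in range(1, 11):
    p = M @ p                          # one step of the walk
    # distance to uniform shrinks by a factor ~ lambda(G)/d each step
    print(t, np.linalg.norm(p - uniform))
```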

Edge vs. Spectral expansion

Edge expansion: high connectivity

Spectral expansion: mixing of random walk

Are they equivalent? No, but highly related

[Cheeger’70,…; Dodziuk’84, Alon-Milman’85, Alon’86]

Vertex expansion

G=(V,E) d-regular graph, |V|=n

Vertex boundary (neighborhood) of S ⊆ V: Γ(S) = { v ∈ V : v has a neighbor in S }

G is an (α,k)-vertex expander if |Γ(S)| ≥ α|S| for every S with |S| ≤ k

Vertex expansion: unique neighbors

G is a unique neighbor expander if for every set S (of size ≤ k) there exists a vertex with exactly one neighbor in S (a “unique neighbor” of S)

Crucial for some applications, e.g. graph-based codes

If G is d-regular and an (α,k)-vertex expander for α > d/2, then G is a unique neighbor expander
(the d|S| edges leaving S reach more than (d/2)|S| distinct neighbors, so some neighbor is reached exactly once)

Constructions of expanders

Most constructions: spectral expanders

Algebraic: explicit constructions [Margulis’73,…]
Example (Selberg 3/16 theorem): vertices Z_p (p prime); 3-regular graph with edges (x, x+1), (x, x−1), (x, 1/x)

Combinatorial: Zig-Zag [Reingold-Vadhan-Wigderson’02,…]
Iterative construction of expanders from smaller ones

Optimal spectral expanders

G: d-regular graph

Optimal spectral expanders: Ramanujan graphs, λ(G) ≤ 2√(d−1)

Alon-Boppana bound: λ(G) ≥ 2√(d−1) − o(1) for any d-regular graph

Achieved by some algebraic constructions

Are most d-regular graphs Ramanujan? open

Theorem [Friedman]: for every ε > 0 and any large enough n, most d-regular graphs have λ(G) ≤ 2√(d−1) + ε
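A quick empirical illustration of Friedman's theorem using networkx (the parameters d = 3, n = 1000 are arbitrary):

```python
import networkx as nx
import numpy as np

d, n = 3, 1000
G = nx.random_regular_graph(d, n, seed=0)
A = nx.to_numpy_array(G)
eigs = sorted(np.linalg.eigvalsh(A), reverse=True)
lam = max(abs(eigs[1]), abs(eigs[-1]))
print(lam, 2 * (d - 1) ** 0.5)   # lambda(G) is typically just above 2*sqrt(d-1) ~ 2.83
```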

Future research

Spectral expanders: well understood, explicit optimal constructions

Combinatorial expansion (edge/vertex):
Follows to some degree from spectral expansion, but not optimally
Requires other methods for construction (e.g. Zig-Zag)
Current constructions are far from optimal

Some applications require tailor-made notions of “pseudo-random graphs”

Further reading

Expander graphs and their applications, by Hoory, Linial and Wigderson

Derandomizing small space computations

Small space computation

Model for computation with small memory, typically of logarithmic size

Application: streaming algorithms

Randomized computation: access to random bits

Reads the input from RAM, and the random bits from a one-way read-only tape

Combinatorial model: branching programs

Branching programs

Combinatorial model for small space programs

Input incorporated into program, reads random bits as input

Model: layered graph

layer = all memory configurations of algorithm

from each vertex two edges (labeled 0/1) go to next layer

[Figure: a layered branching program reading x_1, x_2, x_3, x_4, with 0/1-labeled edges between layers, a start vertex, and accept/reject vertices in the last layer; width = 2^(memory bits)]
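A minimal sketch of evaluating a width-w layered branching program on a string of (pseudo-)random bits; the nested-list representation is my own, purely for illustration:

```python
def evaluate_bp(layers, start, accepting, bits):
    """Evaluate a layered branching program.

    layers[t][v] = (next_if_0, next_if_1): from vertex v of layer t,
    reading bit bits[t] moves to the given vertex of layer t+1.
    The width (max vertices per layer) corresponds to 2^memory."""
    v = start
    for t, bit in enumerate(bits):
        v = layers[t][v][bit]
    return v in accepting

# Width-2 program computing the parity of 3 bits: vertex 0 = "even", 1 = "odd".
parity_layer = [(0, 1), (1, 0)]              # flip state on a 1-bit
layers = [parity_layer] * 3
print(evaluate_bp(layers, start=0, accepting={1}, bits=[1, 0, 1]))  # False (parity 0)
```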

Pseudo-randomness for branching programs

Branching programs “remember” only a small number of bits in each step

Intuitively, this allows randomness to be recycled

This can be made formal – the main ingredient is expander graphs!

Pseudo-randomness for branching programs

Generator for recycling randomness for branching programs

[Nisan’92, Impagliazzo-Nisan-Wigderson’94]

Main idea: build generator recursively

G_0: generator that outputs 1 bit
G_1: generator that outputs 2 bits
…
G_i: generator that outputs 2^i bits
…
G_{log n}: generator that outputs n bits

Main ingredient: an efficient composition to get G_{i+1} from G_i

Composition

Assume G_i: {0,1}^{r_i} → {0,1}^{2^i}

Want G_{i+1}: {0,1}^{r_{i+1}} → {0,1}^{2^{i+1}}

Trivial composition: G_{i+1}(x, y) = (G_i(x), G_i(y)) with two independent seeds
Yields: r_{i+1} = 2·r_i, i.e. seed length n – trivial…

Expander-based composition [INW]:
Let H be a d-regular expander with vertex set {0,1}^{r_i}; take (x, y) = a random edge of H, i.e. G_{i+1}(x, e) = (G_i(x), G_i(y)) where y is the e-th neighbor of x
Yields: r_{i+1} = r_i + log d

How good should the expander be?
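A Python skeleton of this recursion; expander_neighbor is a hypothetical placeholder for “the e-th neighbor of x in the expander H”, and the toy neighbor function below is not an actual expander:

```python
def inw_generator(i, expander_neighbor, log_d):
    """Sketch of the INW recursion: G_{i+1}(x, e) = (G_i(x), G_i(y)),
    where y is the e-th neighbor of x in a d-regular expander H on
    vertex set {0,1}^{r_i}.  Returns a function: seed bits -> 2^i bits.

    expander_neighbor(x_bits, e_bits) -> y_bits is a hypothetical
    placeholder; a real construction plugs in an explicit expander."""
    if i == 0:
        return lambda seed: seed[:1]          # G_0: output 1 bit
    G_prev = inw_generator(i - 1, expander_neighbor, log_d)

    def G(seed):
        x, e = seed[:-log_d], seed[-log_d:]   # seed = vertex x + edge label e
        y = expander_neighbor(x, e)           # recycle x's randomness via H
        return G_prev(x) + G_prev(y)          # seed grows by only log d bits
    return G

def toy_neighbor(x, e):
    """NOT an expander -- a cyclic shift, just to exercise the recursion."""
    shift = int("".join(map(str, e)), 2) % len(x)
    return x[shift:] + x[:shift]

G3 = inw_generator(3, toy_neighbor, log_d=2)   # seed length 1 + 3*2 = 7
print(G3([1, 0, 1, 1, 0, 0, 1]))               # 2^3 = 8 output bits
```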

Composition

Expander-based generator: based on d-regular expanders

seed = O(log n · log d)

INW: to fool width poly(n), it is enough to take d = poly(n) (i.e. error parameter 1/poly(n))

seed length = O(log n)^2

Main challenge: get the seed length down to O(log n)
Would allow enumeration of all seeds in poly(n) time
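The seed-length arithmetic behind the O(log n)^2 bound, spelled out (a standard calculation, not on the slide):

```latex
% Each composition step adds only log d seed bits:
\[
  r_{i+1} = r_i + \log d
  \;\Longrightarrow\;
  r_{\log n} = r_0 + \log n \cdot \log d = O(\log^2 n)
  \quad \text{when } d = \mathrm{poly}(n).
\]
```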

Limited models

Can get optimal seed length, i.e. O(log n), for various limited models, e.g.

Random walks on undirected graphs

[…,Reingold’05]

Constant space reversible computations

[…,Koucky-Nimbhorkar-Pudlak’10]

Symmetric functions

[…,Gopalan-Meka-Reingold-Zuckerman’11]

Future research

Goal: seed length O(log n) for width poly(n) (i.e. log-space computations)

Interesting limited models, open:

Constant space computations (non-reversible)

Reversible log-space
Combinatorial rectangles (derandomize “independent random variables”)

Summary

Pseudo-randomness: framework

Context of “random objects”

Define notion of pseudo-randomness via tests

This talk: local independence, expander graphs

Applications:
Derandomization / compression of randomness
This talk: small space algorithms
Extremal combinatorics

THANK YOU