Presentation Transcript

Slide 1

Generating Random Spanning Trees via Fast Matrix Multiplication
Keyulu Xu, University of British Columbia
Joint work with Nick Harvey

Slide 2

What are random spanning trees?

Goal: Output a uniformly random spanning tree.
More precisely: let T(G) be the set of all spanning trees of a graph G. Output a spanning tree T' such that T' equals any particular tree in T(G) with probability 1/|T(G)|.
Note: |T(G)| can be exponentially large.

Slide 3

Why random spanning trees?

A fundamental probabilistic object [K'1847].
Connections to trending topics in theoretical computer science: spectral graph theory, electrical network flows, and graph structures.
Many applications in theoretical computer science, computer networks, and statistical physics.
Recreation! (Generating random maze puzzles.)

Slide 4

Why random spanning trees?

Applications in theoretical computer science:
Estimating the coefficients of the reliability polynomial [CDM'88]
Generating expander graphs [GRV'09]
Traveling salesman problem [AGMO'10] [OSS'11]
Submodular optimization [CVZ'10]

Slide 5

How to generate RST?

Laplacian-based algorithms [CMN'96], built on the Matrix Tree theorem [Kirchhoff 1847]:
Pr[e in RST] = R_eff(e).
For each edge e: compute R_eff(e) and add e to T with probability R_eff(e); update G by contracting e if e is in T, and delete e otherwise.
Running time: O(n^ω).
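For concreteness, here is a minimal, unoptimized Python sketch of this edge-by-edge loop (illustrative code, not from the talk). It computes R_eff(u, v) as (chi_u - chi_v)^T L^+ (chi_u - chi_v) and simply recomputes the pseudo-inverse with numpy.linalg.pinv after every decision, rather than using the faster updates developed later:

import numpy as np

def sample_spanning_tree(n, edge_list, rng=None):
    """Visit each edge once; keep it with probability equal to its effective
    resistance in the current graph (kept edges contracted, rejected edges removed)."""
    rng = rng or np.random.default_rng()
    comp = list(range(n))                      # comp[v] = contracted super-vertex of v
    tree = []
    for k, (u, v) in enumerate(edge_list):
        if comp[u] == comp[v]:                 # e would close a cycle: Pr[e in T] = 0
            continue
        # Laplacian of the current graph: this edge plus all not-yet-decided edges,
        # with already-kept edges contracted (via comp) and rejected edges absent.
        verts = sorted(set(comp))
        idx = {c: i for i, c in enumerate(verts)}
        L = np.zeros((len(verts), len(verts)))
        for a, b in edge_list[k:]:
            ca, cb = idx[comp[a]], idx[comp[b]]
            if ca != cb:
                L[ca, ca] += 1; L[cb, cb] += 1
                L[ca, cb] -= 1; L[cb, ca] -= 1
        Lp = np.linalg.pinv(L)
        i, j = idx[comp[u]], idx[comp[v]]
        r_eff = Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]   # Pr[e in RST | earlier decisions]
        if rng.random() < r_eff:
            tree.append((u, v))
            old, new = comp[u], comp[v]              # contract e
            comp = [new if c == old else c for c in comp]
    return tree

# Example: a 4-cycle plus a chord.
print(sample_spanning_tree(4, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]))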

Slide 6

How to generate RST?

Random walks [Broder'89, Aldous'90]:
Start a random walk at some vertex s. Whenever the walk visits a new vertex v, add to T the edge through which we reached v.
Running time = O(cover time) = O(mn).
Can get O(mean hitting time), but still O(mn) in the worst case [Wilson'96].
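A short sketch of this Aldous-Broder walk (illustrative code, not from the talk; adjacency-list input is assumed):

import random

def aldous_broder(adj, s=0):
    """adj: dict vertex -> list of neighbours. Walk until every vertex is seen;
    the edge used to enter each vertex for the first time goes into the tree."""
    visited = {s}
    tree = []
    v = s
    while len(visited) < len(adj):
        u = random.choice(adj[v])
        if u not in visited:          # first visit to u: record the entry edge
            visited.add(u)
            tree.append((v, u))
        v = u
    return tree

# Example: the 4-cycle with a chord between 0 and 2.
print(aldous_broder({0: [1, 3, 2], 1: [0, 2], 2: [1, 3, 0], 3: [2, 0]}))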

Slide 7

How to generate RST?

Approximate algorithms [KM'09, MST'15]:
Nearly-linear-time approximate Laplacian system solvers.
Accelerate the random-walk algorithms by identifying regions where the random walk will be slow.
Most efficient on sparse graphs.

Slide 8

Our work

A new Laplacian-based algorithm for generating random spanning trees with running time O(n^ω).
Advantages:
Conceptually simpler and cleaner.
No need to introduce directed graphs.
Uses fast matrix multiplication as a black box, thus avoiding the intricate details of LU-decomposition.
Uses a simple recursion framework that could be applied to many graph algorithms.

Slide 9

Main idea

Pr[e in RST] = R_eff(e).
For e = {u, v}: R_eff(e) = X_u^T L^+ X_v, where X is the characteristic function and L is the Laplacian matrix of the graph G, L = D - A.

Slide 10

Main idea

Pr[e in RST] = R_eff(e). For e = {u, v}, R_eff(e) = X_u^T L^+ X_v.
For each edge e: compute R_eff(e) and add e to T with probability R_eff(e); update L^+ by contracting e if e is in T, and delete e otherwise.
Faster way to update L^+?
Lazy updates.
Visit the edges in a specific order.
Recursion!

Slide 11

Main idea

For e = {u, v}, R_eff(e) = X_u^T L^+ X_v.
Observation: R_eff(e) only depends on 4 entries of L^+, which enables lazy updates!
We need a formula for updating L^+ whenever we contract or delete edges:
The Sherman-Morrison-Woodbury formula for inverse updates.
A Sherman-Morrison-like formula for the Laplacian pseudo-inverse.
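As a quick reminder of the tool being adapted, here is a numerical check of the rank-1 Sherman-Morrison identity (illustrative; the matrix is just a random, well-conditioned test case):

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)) + 5 * np.eye(5)    # a generic invertible matrix
u = rng.standard_normal((5, 1))
v = rng.standard_normal((5, 1))

Ainv = np.linalg.inv(A)
lhs = np.linalg.inv(A + u @ v.T)
rhs = Ainv - (Ainv @ u @ v.T @ Ainv) / (1.0 + (v.T @ Ainv @ u).item())
print(np.max(np.abs(lhs - rhs)))                    # agrees to machine precision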

Slide 12

Pseudo-inverse update formulas

Theorem 1 (update formula for deletion). Let G = (V, E) be a connected, undirected graph and D ⊆ E. If G∖D contains at least one spanning tree, then
(L - L_D)^+ = L^+ - L^+ (L_D L^+ - I)^{-1} L_D L^+.
Update running time is O(|V|^ω); can we do better?
There is no need to update/clean up the entire matrix.
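A small numerical sanity check of this deletion formula (illustrative; NumPy assumed; G is a 4-cycle plus a chord and D deletes the chord, so G∖D stays connected):

import numpy as np

def laplacian(n, edges):
    L = np.zeros((n, n))
    for u, v in edges:
        L[u, u] += 1; L[v, v] += 1
        L[u, v] -= 1; L[v, u] -= 1
    return L

n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
D = [(0, 2)]                                    # G \ D is still connected
L, L_D = laplacian(n, edges), laplacian(n, D)
Lp = np.linalg.pinv(L)

lhs = np.linalg.pinv(L - L_D)
rhs = Lp - Lp @ np.linalg.inv(L_D @ Lp - np.eye(n)) @ L_D @ Lp
print(np.max(np.abs(lhs - rhs)))                # agrees to machine precision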

Slide 13

Pseudo-inverse update formulas

Corollary 1 (improved update formula). Let G = (V, E) be a connected, undirected graph and S ⊆ V. Define E[S] as the set of edges whose vertices are in S. Suppose D ⊆ E[S] and G∖D contains at least one spanning tree. Then
(L - L_D)^+_{S,S} = L^+_{S,S} - L^+_{S,S} ((L_D)_{S,S} L^+_{S,S} - I)^{-1} (L_D)_{S,S} L^+_{S,S}.
The runtime of each update is O(|S|^ω)!
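The same example restricted to a vertex set S, checking that only the S x S blocks are needed (illustrative sketch):

import numpy as np

def laplacian(n, edges):
    L = np.zeros((n, n))
    for u, v in edges:
        L[u, u] += 1; L[v, v] += 1
        L[u, v] -= 1; L[v, u] -= 1
    return L

n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
S = [0, 1, 2]                       # the deleted edge (0,2) lies inside E[S]
L, L_D = laplacian(n, edges), laplacian(n, [(0, 2)])
Lp = np.linalg.pinv(L)

Lp_SS, LD_SS = Lp[np.ix_(S, S)], L_D[np.ix_(S, S)]
updated = Lp_SS - Lp_SS @ np.linalg.inv(LD_SS @ Lp_SS - np.eye(len(S))) @ LD_SS @ Lp_SS
exact = np.linalg.pinv(L - L_D)[np.ix_(S, S)]
print(np.max(np.abs(updated - exact)))          # agrees to machine precision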

Slide 14

Pseudo-inverse update formulas

We want to contract edges while avoiding the cumbersome updates to the Laplacian that result from decreasing the number of vertices.
Our approach: increase the weight of the edge to a large value k. This is equivalent to contracting the edge in the limit as k→∞.
We can obtain a similar update formula for contraction!
Note: the object of interest to us is L^+, and L^+ does have a finite limit as k→∞.
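A tiny illustration of the limiting argument (illustrative sketch): in a 4-cycle plus the chord {0,2}, contracting {0,2} leaves two parallel unit edges between the merged vertex and vertex 1, so R_eff(0,1) should approach 1/2 as the chord's weight k grows.

import numpy as np

def weighted_laplacian(n, wedges):
    L = np.zeros((n, n))
    for (u, v), w in wedges:
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    return L

cycle = [((0, 1), 1), ((1, 2), 1), ((2, 3), 1), ((3, 0), 1)]
for k in [1e2, 1e4, 1e6]:
    Lp = np.linalg.pinv(weighted_laplacian(4, cycle + [((0, 2), k)]))
    r = Lp[0, 0] + Lp[1, 1] - 2 * Lp[0, 1]      # R_eff(0, 1) with chord weight k
    print(k, r)                                  # tends to 0.5 as k grows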

Slide 15

Recursive algorithm

Our pseudo-inverse update formulas indicate that if we have only made sampling decisions for edges in a submatrix, then we can update that submatrix of L^+ at a small cost, using only values from that submatrix.
Recursively sample edges in submatrices!
Apply the update formulas to clean up the matrix.

Slide 16

Recursive Decomposition of Graph

Define:
E[S] = { {u,v} : u,v ∈ S and {u,v} ∈ E }   ("within")
E[S1,S2] = { {u,v} : u ∈ S1, v ∈ S2, and {u,v} ∈ E }   ("crossing")

Claim: Let S = S1 ∪ S2. Then E[S] = E[S1] ∪ E[S2] ∪ E[S1,S2].

[Diagram: S partitioned into S1 and S2, with the edge sets E[S1], E[S2], and E[S1,S2].]

Slide 17

Recursive Decomposition of Graph

Define:
E[S] = { {u,v} : u,v ∈ S and {u,v} ∈ E }   ("within")
E[R,S] = { {u,v} : u ∈ R, v ∈ S, and {u,v} ∈ E }   ("crossing")

Claim: Let R = R1 ∪ R2 and S = S1 ∪ S2. Then E[R,S] = E[R1,S1] ∪ E[R1,S2] ∪ E[R2,S1] ∪ E[R2,S2].

[Diagram: R and S, each split into two halves.]

Slide 18

Recursive Decomposition of Graph

Define:
E[S] = { {u,v} : u,v ∈ S and {u,v} ∈ E }   ("within")
E[R,S] = { {u,v} : u ∈ R, v ∈ S, and {u,v} ∈ E }   ("crossing")

Claim: Let R = R1 ∪ R2 and S = S1 ∪ S2. Then E[R,S] = E[R1,S1] ∪ E[R1,S2] ∪ E[R2,S1] ∪ E[R2,S2].

[Diagram: R split into R1, R2 and S split into S1, S2, with the four crossing edge sets.]
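A small sketch of the within/crossing decomposition and the two claims above, checked on K4 (illustrative; the function names are my own, not from the talk):

def edges_within(E, S):
    S = set(S)
    return {frozenset(e) for e in E if set(e) <= S}

def edges_crossing(E, R, S):
    R, S = set(R), set(S)
    return {frozenset((u, v)) for (u, v) in E
            if (u in R and v in S) or (u in S and v in R)}

E = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]       # K4
S1, S2 = [0, 1], [2, 3]
assert edges_within(E, S1 + S2) == (
    edges_within(E, S1) | edges_within(E, S2) | edges_crossing(E, S1, S2))

R1, R2, T1, T2 = [0], [1], [2], [3]                        # R = R1 ∪ R2, S = T1 ∪ T2
assert edges_crossing(E, R1 + R2, T1 + T2) == (
    edges_crossing(E, R1, T1) | edges_crossing(E, R1, T2) |
    edges_crossing(E, R2, T1) | edges_crossing(E, R2, T2))
print("decomposition claims hold on this example")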

Slide 19

GeneratingRST(G = (V, E)):
  Construct L and N = L^+
  SampleWithin(V)

SampleWithin(S):
  If |S| = 1 then Return
  Partition S = S1 ∪ S2
  For i ∈ {1, 2}:
    SampleWithin(Si)
    Update N_{S,S}
  SampleCrossing(S1, S2)

SampleCrossing(R, S):
  If |R| = |S| = 1:
    Try to sample the R-S edge; Return
  Partition R = R1 ∪ R2 and S = S1 ∪ S2
  For i ∈ {1, 2} and j ∈ {1, 2}:
    SampleCrossing(Ri, Sj)
    Update N_{R,S}

Recursively sample edges.
[Diagram: V partitioned into S1 and S2; SampleWithin(S1), SampleWithin(S2), and SampleCrossing(S1, S2) recursively sample the edges in E[S1], E[S2], and E[S1,S2].]
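A control-flow sketch of this recursion (illustrative; it does not maintain N = L^+, it only records the order in which vertex pairs reach the SampleCrossing base case, showing that every pair, and hence every potential edge, is considered exactly once):

def sample_within(S, visits):
    if len(S) == 1:
        return
    S1, S2 = S[:len(S) // 2], S[len(S) // 2:]      # Partition S = S1 ∪ S2
    for Si in (S1, S2):
        sample_within(Si, visits)
        # (the real algorithm updates N_{S,S} here)
    sample_crossing(S1, S2, visits)

def sample_crossing(R, S, visits):
    if len(R) == 1 and len(S) == 1:
        visits.append((R[0], S[0]))                # "try to sample the R-S edge"
        return
    R1, R2 = R[:max(len(R) // 2, 1)], R[max(len(R) // 2, 1):]
    S1, S2 = S[:max(len(S) // 2, 1)], S[max(len(S) // 2, 1):]
    for Ri in (R1, R2):
        for Sj in (S1, S2):
            if Ri and Sj:
                sample_crossing(Ri, Sj, visits)
            # (the real algorithm updates N_{R,S} here)

visits = []
sample_within(list(range(6)), visits)
assert len({frozenset(p) for p in visits}) == len(visits) == 6 * 5 // 2
print(visits)                                       # each vertex pair appears exactly once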

Slides 20-22

(These slides repeat the GeneratingRST / SampleWithin / SampleCrossing pseudocode from Slide 19, stepping through the recursion in the accompanying diagram: SampleWithin(S1), SampleWithin(S2), then SampleCrossing(S1, S2), including the case where the sampled R-S edge is deleted.)

Slide 23

Runtime Analysis

Let f(n) be the runtime of SampleWithin(S) with n = |S|, and g(n) the runtime of SampleCrossing(R, S) with n = |R| = |S|.
By the Sherman-Morrison-Woodbury formula, each update can be done in O(n^ω) time, so:
g(n) = 4·g(n/2) + O(n^ω)  ⇒  g(n) = O(n^ω)
f(n) = 2·f(n/2) + g(n) + O(n^ω)  ⇒  f(n) = O(n^ω)
The total runtime of the algorithm is O(n^ω).
(The slide repeats the pseudocode from Slide 19 alongside the analysis.)
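A short derivation sketch of these recurrences (my own addition, assuming ω > 2, e.g. ω ≈ 2.37): both fall into case 3 of the master theorem.

\[
\begin{aligned}
g(n) &= 4\,g(n/2) + O(n^{\omega}) &&\Rightarrow\; g(n) = O(n^{\omega}) \quad (\log_2 4 = 2 < \omega),\\
f(n) &= 2\,f(n/2) + g(n) + O(n^{\omega}) &&\Rightarrow\; f(n) = O(n^{\omega}) \quad (\log_2 2 = 1 < \omega).
\end{aligned}
\]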

Slide 24

Conclusion

Generating random spanning trees is a fundamental and important problem in TCS.
We provided a simpler algorithm that also achieves O(n^ω) running time.
Open questions:
Bridging the generation of random spanning trees for dense and sparse graphs?