Tensor Network Renormalization

Glen Evenbly
Guifre Vidal

Tensor Network States: Algorithms and Applications, Beijing, December 2014
(Evenbly, Vidal, arXiv:1412.0732)

Tensor Renormalization Methods

What is the usefulness of renormalization (or coarse-graining) in many-body physics?

The goal: start from a description in terms of very many microscopic degrees of freedom and, through iterative RG transformations, arrive at a description in terms of a few effective (low-energy) degrees of freedom. Each transformation removes short-range (high-energy) degrees of freedom, so the effective theory should contain only universal information (i.e. no microscopic details remaining).

Previous methods based upon tensor renormalization can be very powerful and useful... but they don't give you this! Previous tensor RG methods do not address all short-ranged degrees of freedom at each RG step; some (unwanted) short-ranged detail always remains in the coarse-grained system.

Tensor Renormalization Methods

Previous tensor RG methods do not address all short-ranged degrees of freedom at each RG step; some (unwanted) short-ranged detail always remains in the coarse-grained system.
Consequences:
- they do not give a proper RG flow (i.e. give wrong RG fixed points)
- accumulation of short-ranged degrees of freedom can cause computational breakdown (at or near criticality)

Tensor Network Renormalization (TNR): a method of coarse-graining tensor networks that addresses all short-ranged degrees of freedom at each RG step (Evenbly, Vidal, arXiv:1412.0732).
Consequences:
- gives a proper RG flow (i.e. correct RG fixed points)
- gives a sustainable RG transformation (even at or near criticality)

Outline: Tensor Network Renormalization

Introduction: tensor networks and methods based upon tensor renormalization
Comparison: Tensor Renormalization Group (TRG) vs Tensor Network Renormalization (TNR)
Discussion: failure of previous tensor RG methods to address all short-ranged degrees of freedom
Reformulation: a different prescription for implementing tensor RG methods
Resolution: how to build a tensor RG scheme that addresses all short-ranged degrees of freedom

Tensor Renormalization Methods

The exact ground state of a quantum system can be expressed (in terms of a Euclidean path integral) as a tensor network, with one network direction being the 1D lattice in space and the other being evolution in imaginary time.

Tensor Renormalization Methods

For a 1D quantum system with Hamiltonian H, the ground state is obtained from imaginary-time evolution, |GS⟩ ∝ lim_{β→∞} e^{-βH} |ψ0⟩, and the corresponding Euclidean path integral over the 1D spatial lattice and imaginary time takes the form of a 2D tensor network.

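As a concrete illustration (not part of the slides), here is a minimal sketch of one building block of such a network; the choice of model (a transverse-field Ising chain), the coupling values and the Trotter step are assumptions made only for the example.

```python
import numpy as np

# Minimal sketch (assumption: 1D transverse-field Ising chain,
# H = -sum_i Z_i Z_{i+1} - g sum_i X_i, imaginary-time step dtau).
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])
I2 = np.eye(2)
g, dtau = 1.0, 0.1

# two-site term of H (single-site field split evenly between neighbouring bonds)
h2 = -np.kron(Z, Z) - 0.5 * g * (np.kron(X, I2) + np.kron(I2, X))

# imaginary-time gate exp(-dtau * h2), built via eigendecomposition,
# then reshaped into a 4-index tensor (two outgoing and two incoming legs)
w, V = np.linalg.eigh(h2)
gate = ((V * np.exp(-dtau * w)) @ V.T).reshape(2, 2, 2, 2)

# A brickwork of such gates, stacked ~ beta/dtau times in the imaginary-time
# direction, gives the 2D tensor network representing e^{-beta H}.
```
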
Tensor Renormalization Methods

For a 1D quantum system with Hamiltonian H, the ground-state expectation value of an observable can likewise be expressed as a tensor network (the Euclidean path integral with the observable inserted).

There are many different approaches to evaluate a network of this form (e.g. Monte Carlo, transfer matrix methods, tensor RG...).

Tensor RG: evaluate the expectation value through a sequence of controlled (quasi-exact) coarse-graining transformations of the network.

Tensor Renormalization Methods

Sequence of coarse-graining transformations applied to the tensor network. The network could represent:
- the Euclidean path integral of a 1D quantum system
- the partition function of a 2D classical statistical system
- the expectation value of a local observable, a two-point correlator, etc.

Tensor Renormalization Methods

Sequence of coarse-graining transformations applied to the tensor network: after successive RG steps, a network of width N is reduced to a single scalar (its contracted value). This defines an RG flow in the space of tensors.

Tensor Renormalization Methods

Tensor Renormalization Group (TRG), the first tensor RG approach (Levin, Nave, 2006), generates such an RG flow in the space of tensors. Tensor RG approaches borrow from earlier ideas (e.g. Kadanoff spin blocking).

TRG is based upon a truncated singular value decomposition (SVD): a four-index tensor is split across a cut into a pair of three-index tensors of bond dimension χ, which incurs a truncation error related to the size of the discarded singular values.

Tensor Renormalization Group (TRG) (Levin, Nave, 2006)

Starting from the initial network, each tensor is split by a truncated SVD, and the resulting three-index tensors are then contracted in groups of four to produce a coarser network. Repeating the two steps (truncated SVD, then contraction) generates the RG flow.

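A minimal NumPy sketch of one such TRG step follows (not from the slides): the index ordering A[u, l, d, r], the pairing of legs in the two splits, and the labelling of the new legs are assumptions of this example, and only a single coarse-graining step is shown.

```python
import numpy as np

def trg_step(A, chi):
    """One TRG coarse-graining step (in the spirit of Levin & Nave) for a
    uniform 4-index tensor A[u, l, d, r] on the square lattice."""

    def split(M, chi):
        # truncated SVD: M ~ X @ Y with inner dimension at most chi
        U, S, Vh = np.linalg.svd(M, full_matrices=False)
        k = min(chi, int(np.sum(S > 1e-14)))
        half = np.sqrt(S[:k])
        return U[:, :k] * half, (Vh[:k, :].T * half).T

    du, dl, dd, dr = A.shape

    # sublattice 1: cut between the (u, l) and (d, r) leg pairs
    X, Y = split(A.reshape(du * dl, dd * dr), chi)
    S1 = X.reshape(du, dl, -1)            # S1[u, l, m]
    S3 = Y.reshape(-1, dd, dr)            # S3[m, d, r]

    # sublattice 2: cut between the (l, d) and (r, u) leg pairs
    X, Y = split(A.transpose(1, 2, 3, 0).reshape(dl * dd, dr * du), chi)
    S2 = X.reshape(dl, dd, -1)            # S2[l, d, m]
    S4 = Y.reshape(-1, dr, du)            # S4[m, r, u]

    # contract the four half-tensors around one plaquette:
    # top-left vertex -> S3, top-right -> S2, bottom-right -> S1, bottom-left -> S4
    Anew = np.einsum('wat,trx,rby,zba->wxyz', S3, S2, S1, S4)
    # the new legs (w, x, y, z) point along the diagonals (NW, NE, SE, SW)
    # of the original lattice, i.e. the coarse lattice is rotated by 45 degrees
    return Anew
```
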
Tensor Renormalization Methods

Tensor Renormalization Group (TRG), the first tensor RG approach (Levin, Nave, 2006), borrows from earlier ideas (e.g. Kadanoff spin blocking) and generates an RG flow in the space of tensors.

Many improvements and generalizations followed:
- Second Renormalization Group (SRG) (Xie, Jiang, Weng, Xiang, 2008)
- Tensor Entanglement Filtering Renormalization (TEFR) (Gu, Wen, 2009)
- Higher Order Tensor Renormalization Group (HOTRG) (Xie, Chen, Qin, Zhu, Yang, Xiang, 2012)
- and many more...

These give improvements in accuracy (e.g. by taking more of the local environment into account when truncating) or allow application to higher-dimensional systems, etc.

Tensor Renormalization Methods

But previous tensor RG approaches do not address all short-ranged degrees of freedom at each RG step.
Consequences:
- they do not give a proper RG flow (i.e. give wrong RG fixed points)
- they do not give a sustainable RG flow (at or near criticality)

Tensor Network Renormalization (TNR): a method of coarse-graining tensor networks that addresses all short-ranged degrees of freedom at each RG step (Evenbly, Vidal, arXiv:1412.0732).
Consequences:
- gives a proper RG flow (i.e. correct RG fixed points)
- gives a sustainable RG flow (even at or near criticality)

Outline: Tensor Network Renormalization

Introduction: tensor networks and methods based upon tensor renormalization
Comparison: Tensor Renormalization Group (TRG) vs Tensor Network Renormalization (TNR)
Discussion: failure of previous tensor RG methods to address all short-ranged degrees of freedom
Reformulation: a different prescription for implementing tensor RG methods
Resolution: how to build a tensor RG scheme that addresses all short-ranged degrees of freedom

Tensor Renormalization Methods

Proper RG flow: consider the 2D classical Ising ferromagnet at temperature T. Encode the partition function (at temperature T) as a tensor network, and study the resulting RG flow in the space of tensors.

Phases:
- T < Tc: ordered phase
- T = Tc: critical point (correlations at all length scales)
- T > Tc: disordered phase

Proper RG flow: 2D classical Ising, disordered phase

Numerical results, tensor renormalization group (TRG): flows within the disordered phase should converge to the same (trivial) fixed point, but don't!

Numerical results, Tensor Network Renormalization (TNR): the flows converge to the same fixed point (containing only information on the universal properties of the phase).

Proper RG flow: 2D classical Ising

Numerical results for the 2D classical Ising model, Tensor Network Renormalization (TNR): the flow converges to one of three RG fixed points, consistent with a proper RG flow.
- sub-critical: ordered (Z2) fixed point
- critical: critical (scale-invariant) fixed point
- super-critical: disordered (trivial) fixed point

Tensor Renormalization Methods

Proper RG flow (recap): consider the 2D classical Ising ferromagnet at temperature T. Encode the partition function (at temperature T) as a tensor network, and study the resulting RG flow in the space of tensors.

Phases:
- T < Tc: ordered phase
- T = Tc: critical point (correlations at all length scales)
- T > Tc: disordered phase

Sustainable RG flow: 2D classical Ising

What is a sustainable RG flow? The key step of TRG is a truncated singular value decomposition (SVD). Let χ_s be the number of singular values (or bond dimension) needed to maintain a fixed truncation error ε at RG step s. The RG flow is sustainable if χ_s is upper bounded by a constant. Is TRG sustainable?

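For concreteness (not from the slides), here is a minimal sketch of how χ can be read off from a singular value spectrum; the convention that the truncation error is the relative Frobenius norm of the discarded part is an assumption of the example.

```python
import numpy as np

def chi_for_error(singular_values, eps):
    """Smallest number of singular values to keep so that the relative
    truncation error (norm of the discarded part over the norm of the
    full spectrum) is at most eps."""
    s = np.sort(np.abs(np.asarray(singular_values)))[::-1]
    total = np.sum(s ** 2)
    for k in range(len(s) + 1):
        err = np.sqrt(np.sum(s[k:] ** 2) / total)
        if err <= eps:
            return k
    return len(s)

# example: a rapidly decaying spectrum needs only a modest chi for eps ~ 1e-3
print(chi_for_error(2.0 ** -np.arange(30), 1e-3))
```
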
Sustainable RG flow: 2D classical Ising (RG flow at criticality)

Does TRG give a sustainable RG flow?

[Figure: singular value spectra at successive RG steps s of the critical flow under TRG.]

Bond dimension required to maintain a fixed truncation error (~10^-3) with TRG: roughly 10, 20, 40, then >100 at successive RG steps. Since the computational cost grows rapidly with the bond dimension, the cost of TRG scales exponentially with the RG iteration!

Sustainable RG flow: 2D classical Ising (RG flow at criticality)

Does TRG give a sustainable RG flow?

[Figure: singular value spectra at successive RG steps s of the critical flow, TRG vs TNR.]

Bond dimension required to maintain a fixed truncation error (~10^-3):
- TRG: roughly 10, 20, 40, then >100 at successive RG steps (computational cost grows with each iteration)
- TNR: roughly 10 at every RG step; the flow is sustainable, with bounded computational cost

Tensor Renormalization Methods

Previous RG methods for contracting tensor networks vs Tensor Network Renormalization (TNR):
- do not give a proper RG flow (wrong RG fixed points) vs gives a proper RG flow (correct RG fixed points)
- unsustainable RG flow (at or near criticality) vs can give a sustainable RG flow
- do not address all short-ranged degrees of freedom vs can address all short-ranged degrees of freedom

Analogous to: tree tensor network (TTN) vs multi-scale entanglement renormalization ansatz (MERA).

Tensor Renormalization Methods

Previous RG methods for contracting tensor networks vs Tensor Network Renormalization (TNR):
- do not give a proper RG flow (wrong RG fixed points) vs gives a proper RG flow (correct RG fixed points)
- unsustainable RG flow (at or near criticality) vs can give a sustainable RG flow
- do not address all short-ranged degrees of freedom vs can address all short-ranged degrees of freedom

Can we see how TRG fails to address all short-ranged degrees of freedom? ... consider the fixed points of TRG.

Outline: Tensor Network Renormalization

Introduction: tensor networks and methods based upon tensor renormalization
Comparison: Tensor Renormalization Group (TRG) vs Tensor Network Renormalization (TNR)
Discussion: failure of previous tensor RG methods to address all short-ranged degrees of freedom
Reformulation: a different prescription for implementing tensor RG methods
Resolution: how to build a tensor RG scheme that addresses all short-ranged degrees of freedom

Fixed points of TRG

Imagine "A" is a special tensor such that each index can be decomposed as a product of smaller indices, and such that certain pairs of these sub-indices are perfectly correlated. These are called corner double line (CDL) tensors. CDL tensors are fixed points of TRG, and a partition function built from CDL tensors represents a state with only short-ranged correlations.

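A minimal sketch of such a tensor (not from the slides; the particular corner pairing and sub-index layout are assumptions of the example):

```python
import numpy as np

def cdl_tensor(k):
    """Corner double-line (CDL) tensor with legs (up, left, down, right),
    each of dimension k*k.  Every leg is a pair of sub-indices, and the two
    sub-indices meeting at each corner are locked together by a delta."""
    d = np.eye(k)
    # A8[u1,u2, l1,l2, d1,d2, r1,r2] =
    #   delta(u1,l1) * delta(u2,r1) * delta(d1,l2) * delta(d2,r2)
    A8 = np.einsum('ab,cd,ef,gh->acbfegdh', d, d, d, d)
    return A8.reshape(k * k, k * k, k * k, k * k)

k = 2
A = cdl_tensor(k)
# Across the (up,left)|(down,right) cut used by TRG, only the two corners
# straddling the cut contribute, so the matrix rank is k*k rather than
# (k*k)**2: the truncated SVD is exact at bond dimension k*k, and the
# recombined pieces form a new CDL tensor (CDL tensors are TRG fixed points).
M = A.reshape((k * k) ** 2, (k * k) ** 2)
print(np.linalg.matrix_rank(M))   # k*k
```
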
Fixed points of TRG

Applying the TRG step (singular value decomposition, then contraction) to a network of CDL tensors produces a new CDL tensor! Some short-ranged correlations always remain under TRG.

Fixed points of TRG

TRG removes some short-ranged correlations, but others are artificially promoted to the next length scale: a short-range correlated network is coarse-grained into another short-range correlated network. Coarse-grained networks therefore always retain some dependence on the microscopic (short-ranged) details, and the accumulation of short-ranged correlations causes computational breakdown at or near criticality.

Is there some way to 'fix' tensor renormalization such that all short-ranged correlations are addressed?

Outline: Tensor Network Renormalization

Introduction: tensor networks and methods based upon tensor renormalization
Comparison: Tensor Renormalization Group (TRG) vs Tensor Network Renormalization (TNR)
Discussion: failure of previous tensor RG methods to address all short-ranged degrees of freedom
Reformulation: a different prescription for implementing tensor RG methods
Resolution: how to build a tensor RG scheme that addresses all short-ranged degrees of freedom

Reformulation of Tensor RG

Change in formalism: from an RG scheme based on SVD decompositions to an RG scheme based on the insertion of projectors into the network.

[Diagram: the truncated singular value decomposition (SVD) across a cut, equivalently expressed as the insertion of a rank-χ projector.]

Reformulation of Tensor RG

Change in formalism: from an RG scheme based on SVD decompositions to an RG scheme based on the insertion of projectors into the network.

In the projector picture, the truncated SVD corresponds to inserting an isometry w: we want to choose w so as to minimize the truncation error, which is achieved by setting w to the leading singular vectors of the tensor being truncated.

Reformulation of Tensor RG

Change in formalism: from an RG scheme based on SVD decompositions to an RG scheme based on the insertion of projectors into the network.

A truncated SVD gives an approximate decomposition of a four-index tensor into a pair of three-index tensors; applying the projection w w† achieves the same thing. If the isometry w is optimised to act as an approximate resolution of the identity, then the two schemes are equivalent!

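A minimal numerical check of this equivalence (not from the slides; the matrix below is a random stand-in for a tensor reshaped across the cut):

```python
import numpy as np

rng = np.random.default_rng(0)
chi = 8
M = rng.normal(size=(32, 32))            # stand-in for a tensor reshaped across the cut

# (a) truncated SVD: the best rank-chi approximation of M
U, S, Vh = np.linalg.svd(M, full_matrices=False)
M_svd = U[:, :chi] @ np.diag(S[:chi]) @ Vh[:chi, :]

# (b) insertion of the projector w w^dagger, with the isometry w chosen as
#     the leading chi left singular vectors, so that w w^dagger acts as an
#     approximate resolution of the identity on M
w = U[:, :chi]                            # w^dagger w = identity (chi x chi)
M_proj = w @ (w.T @ M)

print(np.allclose(M_svd, M_proj))         # True: the two schemes coincide
```
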
Reformulation of Tensor RG

Change in formalism: from an RG scheme based on SVD decompositions to an RG scheme based on the insertion of projectors into the network.

The same holds for the truncated higher-order SVD (HOSVD): applying a projection can give an equivalent decomposition, so HOTRG can also be formulated in terms of the insertion of projectors.

Reformulation of Tensor RG

TRG: truncated SVD, then contraction. Equivalent scheme: insert projectors, then contract. The projector formulation can also reduce the cost of TRG.

Reformulation of Tensor RG

Insertion of projectors can mimic a matrix decomposition (e.g. SVD)... but it can also do things that cannot be done using a matrix decomposition. For example, consider a projector that is decomposed as a product of four-index isometries, restricted to the case in which u is a unitary.

Reformulation of Tensor RG

Insertion of projectors can mimic a matrix decomposition (e.g. SVD)... but it can also do things that cannot be done using a matrix decomposition. In particular, when u is a unitary, inserting u†u is an exact resolution of the identity.

Reformulation of Tensor RG

Insertion of projectors can mimic a matrix decomposition (e.g. SVD)... but it can also do things that cannot be done using a matrix decomposition. The tensor network renormalization (TNR) approach follows from the composition of these insertions...

Outline: Tensor Network Renormalization

Introduction: tensor networks and methods based upon tensor renormalization
Comparison: Tensor Renormalization Group (TRG) vs Tensor Network Renormalization (TNR)
Discussion: failure of previous tensor RG methods to address all short-ranged degrees of freedom
Reformulation: a different prescription for implementing tensor RG methods
Resolution: how to build a tensor RG scheme that addresses all short-ranged degrees of freedom

Tensor Network Renormalization

Insert exact resolutions of the identity and approximate resolutions of the identity into the network.

Tensor Network Renormalization

Contract the resulting groups of tensors.

Tensor Network Renormalization

Contract, perform a singular value decomposition, and contract again; the unitary tensors introduced in this construction are the disentanglers.

Tensor Network Renormalization

Without the disentanglers this sequence is equivalent to TRG; with them it defines Tensor Network Renormalization (TNR).

Tensor Network Renormalization (TNR)

Insert exact resolutions of the identity and approximate resolutions of the identity. If the disentanglers 'u' are removed, then the TNR approach becomes equivalent to TRG. (The algorithm required to optimize the disentanglers 'u' and isometries 'w' is not discussed here.)

Does TNR address all short-ranged degrees of freedom?

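The optimization of 'u' and 'w' is not covered in the talk. As a rough orientation only, tensors of this kind are commonly updated from their linearised environments in MERA-style algorithms; whether TNR uses exactly this form is an assumption here, not a statement from the slides.

```python
import numpy as np

def update_from_environment(env):
    """Given the linearised environment 'env' of a unitary tensor (reshaped
    to a square matrix), return the unitary u that maximises Re tr(u @ env):
    with the SVD env = W @ diag(S) @ Vh, the maximiser is u = Vh^dagger @ W^dagger.
    (If the cost function is instead minimised the update picks up a sign;
    isometric tensors are handled analogously.)"""
    W, S, Vh = np.linalg.svd(env)
    return Vh.conj().T @ W.conj().T
```
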
Tensor Network Renormalization (TNR)

What is the effect of disentangling? The key step of the TNR algorithm is the insertion of unitary disentanglers. Under TRG, a short-range correlated network is mapped to another short-range correlated network; under TNR, the disentanglers remove the short-ranged correlations and the network is mapped to a trivial (product) state. TNR can address all short-ranged degrees of freedom!

Tensor Network Renormalization (TNR) vs Tensor Entanglement Filtering Renormalization (TEFR)

An earlier attempt at resolving the problem of the accumulation of short-ranged degrees of freedom: TEFR (Gu, Wen, 2009). Like TNR, TEFR can transform a network of CDL tensors (a short-range correlated state) into a trivial (product) state.

Tensor Network Renormalization (TNR): a more difficult case

Can short-ranged correlations still be removed when correlations at many length scales are present? Removing correlations from the CDL fixed point is necessary, but not sufficient, to generate a proper RG flow.

TEFR? No, it does not appear so: tensor entanglement filtering renormalization only removes correlations from CDL tensors (i.e. from systems far from criticality).

TNR? Yes: tensor network renormalization can remove short-ranged correlations near or at criticality, and potentially in higher dimension D.

Outline: Tensor Network Renormalization

Introduction: tensor networks and methods based upon tensor renormalization
Comparison: Tensor Renormalization Group (TRG) vs Tensor Network Renormalization (TNR)
Discussion: failure of previous tensor RG methods to address all short-ranged degrees of freedom
Reformulation: a different prescription for implementing tensor RG methods
Resolution: how to build a tensor RG scheme that addresses all short-ranged degrees of freedom

Benchmark numerics: 2D classical Ising model

[Figure: error in the free energy vs temperature T near Tc, comparing TRG and TNR; and spontaneous magnetization vs temperature T, TNR against the exact result.]

Tensor Renormalization Methods

Proper RG flow (recap): consider the 2D classical Ising ferromagnet at temperature T. Encode the partition function (at temperature T) as a tensor network, and study the resulting RG flow in the space of tensors.

Phases:
- T < Tc: ordered phase
- T = Tc: critical point (correlations at all length scales)
- T > Tc: disordered phase

Proper RG flow: 2D classical Ising, disordered phase

Numerical results, tensor renormalization group (TRG): the flows end at CDL tensor fixed points, retaining non-universal short-ranged detail.
Numerical results, Tensor Network Renormalization (TNR): the flows reach the same (trivial) fixed point.

Proper RG flow: 2D classical Ising (a more difficult case)

[Figure: RG flow of the tensors obtained with TNR at the stated bond dimension.]

Proper RG flow: 2D classical Ising, at the critical point

[Figure: RG flow of the tensors obtained with TNR at the stated bond dimension.]

Proper RG flow: 2D classical Ising (a more difficult case)

[Figure: RG flow of the tensors at the stated bond dimension.]

Sustainable RG flow: 2D classical Ising (RG flow at criticality)

Does TRG give a sustainable RG flow?

[Figure (a): singular value spectra at successive RG steps s of the critical flow, TRG vs TNR.]

Bond dimension required to maintain a fixed truncation error (~10^-3):
- TRG: roughly 10, 20, 40, then >100 at successive RG steps
- TNR: roughly 10 at every RG step (sustainable, with bounded computational cost)

Summary

We have introduced an RG-based method for contracting tensor networks: Tensor Network Renormalization (TNR).

Key idea: a proper arrangement of isometric 'w' and unitary 'u' tensors addresses all short-ranged degrees of freedom at each RG step.

Key features of TNR:
- addresses all short-ranged degrees of freedom
- proper RG flow (gives correct RG fixed points)
- sustainable RG flow (can iterate without increase in cost)

Direct applications to the study of 2D classical and 1D quantum many-body systems, and to the contraction of PEPS. In 2D quantum (or 3D classical) systems the accumulation of short-ranged degrees of freedom in HOTRG is much worse (due to the entanglement area law)... but a higher-dimensional generalization of TNR could still generate a sustainable RG flow.