Presentation Transcript

Slide 1: Multilevel Monte Carlo Metamodeling

Imry Rosenbaum and Jeremy Staum

Slide 2: Outline

What is simulation metamodeling?
Metamodeling approaches
Why use function approximation?
Multilevel Monte Carlo
MLMC in metamodeling

Slide 3: Simulation Metamodeling

Simulation: given an input point, we observe a simulation output. Each observation is noisy.
Effort is measured by the number of observations.
We use the simulation output to estimate the response surface.
Simulation metamodeling: a fast estimate of the response at any given input, answering "what does the response surface look like?"
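A minimal sketch of this setup in symbols, with the notation $x$ (design point), $Y(x)$ (noisy simulation output), and $\mu$ (response surface) chosen here as assumptions:

$$\mu(x) = \mathbb{E}[Y(x)], \qquad \hat{\mu}_n(x) = \frac{1}{n} \sum_{i=1}^{n} Y_i(x).$$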

Slide 4: Why Do We Need Metamodeling?

What-if analysis: how things will change under different scenarios.
Applicable in financial, business, and military settings.
For example, multi-product asset portfolios: how the product mix will change our business profit.

Slide 5: Approaches

Regression
Interpolation
Kriging
Stochastic kriging
Kernel smoothing

Slide 6: Metamodeling as Function Approximation

Metamodeling is essentially function approximation under uncertainty.
Information-based complexity has answers for such settings.
One of those answers is multilevel Monte Carlo.

Slide 7: Multilevel Monte Carlo

Multilevel Monte Carlo was first suggested as a numerical method for parametric integration.
Later the notion was extended to SDEs.
In our work we extend the multilevel notion to stochastic simulation metamodeling.

Slide 8: Multilevel Monte Carlo

In 1998 Stefan Heinrich introduced the notion of multilevel Monte Carlo.
The scheme reduces the computational cost of estimating a family of integrals.
We use the smoothness of the underlying function in order to enhance our estimate of the integrals.

Slide 9: Example

Let us consider a parametric integrand and suppose we want to compute its integral for every value of the parameter.
We will fix a grid of parameter values, estimate the respective integrals, and interpolate.
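A sketch of the parametric-integration problem in symbols, with $f$, $\lambda$, $I(\lambda)$, and the unit-interval domains all chosen here as assumptions:

$$I(\lambda) = \int_0^1 f(\lambda, x)\,dx \quad \text{for all } \lambda \in [0, 1].$$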

Slide 10: Example Continued

We will use a piecewise linear approximation, where the weights are the respective hat functions and the grid values are Monte Carlo estimates of the integrals computed from iid uniform random variables.
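A sketch of this estimator, with grid points $\lambda_0, \dots, \lambda_n$, hat functions $\phi_j$, sample size $m$, and iid uniform draws $X_1, \dots, X_m$ as assumed notation:

$$\hat{I}(\lambda) = \sum_{j=0}^{n} \phi_j(\lambda)\, \hat{I}(\lambda_j), \qquad \hat{I}(\lambda_j) = \frac{1}{m} \sum_{i=1}^{m} f(\lambda_j, X_i).$$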

Slide 11: Example Continued

Let us use the root mean square norm as the error metric.
It can be shown that, under our smoothness assumption, a given error level is achieved at a corresponding computational cost.

Slide 12: Example Continued

Let us consider a sequence of grids of increasing fineness.
We can represent our estimator as a sum of level differences, where each term is the estimate using the corresponding grid.
We define each of our decision variables in terms of the total budget M, so as to keep the comparison fair.
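A sketch of this representation, writing $\hat{I}_\ell$ for the estimator built on the $\ell$-th grid (assumed notation) and telescoping across levels:

$$\hat{I}_L(\lambda) = \hat{I}_0(\lambda) + \sum_{\ell=1}^{L} \bigl(\hat{I}_\ell(\lambda) - \hat{I}_{\ell-1}(\lambda)\bigr).$$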

Slide 13: Example Continued

[Charts: square root of variance per level and cost per level, for levels 0 through L.]
The variance reaches its maximum at the first level, but the cost reaches its maximum at the last level.

Slide 14: Example Continued

Let us now use a different number of observations at each level; the estimator then takes a multilevel form with per-level sample sizes.
We choose these sample sizes to balance cost against variance.
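One standard way to balance cost and variance, shown here as an assumption rather than necessarily the slide's exact choice, is to allocate the level-$\ell$ sample size in proportion to the square root of the ratio of the level variance $V_\ell$ to the level cost $C_\ell$:

$$m_\ell \propto \sqrt{V_\ell / C_\ell}.$$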

Slide 15: Example Continued

[Charts: square root of variance per level and cost per level after rebalancing, for levels 0 through L.]
It follows that the rebalanced estimator attains the same variance as before at a lower computational cost.

Slide 16: Generalization

Let the parameter domain and the integration domain be bounded open sets with Lipschitz boundary.
We assume a Sobolev embedding condition.

Slide 17: General Theorem

Theorem 1 (Heinrich). Under the above assumptions, there exist constants such that for each integer there is a choice of the method's parameters for which the cost of computing the multilevel estimator is bounded accordingly, together with a matching bound on its error.

Slide 18: Issues

MLMC requires smoothness to work, but can we guarantee such smoothness?
Moreover, the more dimensions we have, the more smoothness we require.
Is there a setting that helps alleviate these concerns?

Slide 19: Answer

The answer to our question came from the derivative-estimation setting in Monte Carlo simulation.
Derivative estimation is mainly used in finance to estimate the Greeks of financial derivatives.
Glasserman and Broadie presented a framework under which the pathwise estimator is unbiased.
This framework is suitable in our case as well.

Slide 20: Simulation MLMC

Goal
Framework
Multilevel Monte Carlo method
Computational complexity
Algorithm
Results

Slide 21: Goal

Our goal is to estimate the response surface.
The aim is to minimize the total number of observations used by the estimator.
Effort is relative to the amount of precision we require.

Slide 22: Elements We Will Need for MLMC

Smoothness provided the information about how adjacent points behave.
Our assumptions on the function will provide the same information.
The choice of approximation and grid allows us to preserve these properties in the estimator.

Slide 23: The Framework

First we assume that our simulation output can be written as a function of the design point and a random vector, and that this function is Hölder continuous in the design point: there exist an exponent and a coefficient such that the Hölder bound holds for all pairs of design points in the design space.

Slide 24: Framework Continued

Next we assume that there exists a random variable with a finite second moment that serves as the Hölder coefficient for all pairs of design points, almost surely.
Furthermore, we assume that the design space is compact.
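A sketch of the Hölder condition these two slides describe, with $Y(x, U)$ for the output at design point $x$ driven by random vector $U$, exponent $\gamma$, random coefficient $K(U)$, and design space $\mathcal{X}$ all assumed notation:

$$|Y(x, U) - Y(x', U)| \le K(U)\, \|x - x'\|^{\gamma} \ \text{ for all } x, x' \in \mathcal{X} \text{ a.s.}, \qquad \mathbb{E}\bigl[K(U)^2\bigr] < \infty.$$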

Slide 25: Behavior of Adjacent Points

The bias incurred by estimating the response at a design point using observations at a nearby point is controlled by the Hölder condition; it follows immediately that this bias is bounded by a power of the distance between the two points.
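In the assumed notation above, the resulting bias bound reads:

$$\bigl|\mathbb{E}[Y(x', U)] - \mu(x)\bigr| \le \mathbb{E}[K(U)]\, \|x - x'\|^{\gamma}.$$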

Slide 26: Multilevel Monte Carlo

Assume we have a sequence of experiment designs (grids) with an increasing number of points.
The experiment designs are structured so that the maximum distance between a point in the design space and the nearest design point shrinks with the level; we denote this covering distance for each level.
Let the level-l approximation denote the approximation of the response surface built on that design, using the same number of observations at each design point.

Slide 27: Approximating the Response

Slide 28: MLMC Decomposition

Let us rewrite the expectation of our approximation in the multilevel way, as a telescoping sum over levels.
Let us define the estimator of each level term using m observations.
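A sketch of the multilevel rewriting, with $A_\ell$ for the level-$\ell$ approximation (assumed notation):

$$\mathbb{E}[A_L(x)] = \mathbb{E}[A_0(x)] + \sum_{\ell=1}^{L} \mathbb{E}\bigl[A_\ell(x) - A_{\ell-1}(x)\bigr].$$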

Slide 29: MLMC Decomposition Continued

Next we can write the estimator itself in the multilevel decomposition.
Do we really have to use the same number of observations for all levels?

Slide 30: The MLMC Estimator

We denote the MLMC estimator as a sum over levels of independently estimated level differences, where each level is allowed its own number of observations.
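A sketch of such an estimator, with $m_\ell$ independent observations of the level-$\ell$ difference and the convention $A_{-1} \equiv 0$ (assumed notation):

$$\hat{\mu}^{\mathrm{ML}}(x) = \sum_{\ell=0}^{L} \frac{1}{m_\ell} \sum_{i=1}^{m_\ell} \bigl(A_\ell^{(i)}(x) - A_{\ell-1}^{(i)}(x)\bigr).$$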

Slide 31: Multilevel Illustration

Slide 32: Multilevel MC Estimators

We introduce notation for the level differences and consider approximations of this multilevel form.

Slide 33: Approximation Requirements

We assume that for each level there exists a positive window size of a prescribed order such that the required conditions hold for the approximation at every design point; these conditions drive the bias and variance bounds on the next slide.

Slide 34: Bias and Variance of the Approximation

Under these assumptions we can bound the bias and variance of the approximation at each level.
Our measure of error is the mean integrated squared error.
Next, we can use a theorem provided by Cliffe et al. to bound the computational complexity of the MLMC.
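For reference, the mean integrated squared error of an estimator $\hat{\mu}$ of the response surface $\mu$ over the design space $\mathcal{X}$ (assumed notation):

$$\mathrm{MISE}(\hat{\mu}) = \mathbb{E} \int_{\mathcal{X}} \bigl(\hat{\mu}(x) - \mu(x)\bigr)^2 \, dx.$$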

Slide 35: Computational Complexity Theorem

Theorem. Let a simulation response surface be given, and consider its level-l estimator using a given number of replications at each design point. Suppose there exist constants such that the bias, the variance, and the computational cost of each level are bounded in terms of the level.
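The standard form of these conditions in the Cliffe et al. multilevel theorem, shown as a sketch with rates $\alpha, \beta, \gamma$, constants $c_1, c_2, c_3$, and $\hat{D}_\ell$ for the level-$\ell$ difference estimator built from $m_\ell$ replications, all assumed notation:

$$\bigl|\mathbb{E}[A_\ell(x)] - \mu(x)\bigr| \le c_1 2^{-\alpha \ell}, \qquad \operatorname{Var}\bigl[\hat{D}_\ell(x)\bigr] \le \frac{c_2\, 2^{-\beta \ell}}{m_\ell}, \qquad C_\ell \le c_3\, m_\ell\, 2^{\gamma \ell}.$$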

Slide 36: Theorem Continued

Then for every target accuracy there exist values of the number of levels and the per-level numbers of observations for which the MSE of the MLMC estimator is bounded by the target accuracy, with a bounded total computational cost.
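In the standard statement of this result, the conclusion is that for any $\varepsilon > 0$ there are choices of $L$ and $m_0, \dots, m_L$ with $\mathrm{MSE} \le \varepsilon^2$ and total cost

$$C \le c_4 \begin{cases} \varepsilon^{-2}, & \beta > \gamma, \\ \varepsilon^{-2} (\log \varepsilon)^2, & \beta = \gamma, \\ \varepsilon^{-2 - (\gamma - \beta)/\alpha}, & \beta < \gamma, \end{cases}$$

shown here in its usual form rather than as the slide's exact statement.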

Slide 37: Multilevel Monte Carlo Algorithm

The theoretical results need to be translated into a practical setting.
For simplicity we consider only the Lipschitz continuous setting.

Slide 38: Simplifying Assumptions

The constants stated in the theorem are crucial for deciding when to stop; however, in practice they will not be known to us.
If the bias decays geometrically in the level, we can deduce a computable bound on the remaining bias.

Slide 39: Simplifying Assumptions Continued

Hence, we can use the magnitude of the last estimated level difference as a pessimistic estimate of the bias at that level.
Thus, we will continue adding levels until a convergence criterion based on this estimate is met.
However, due to its inherent variance, we would recommend a more robust stopping criterion.
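One common criterion of this kind in the MLMC literature, shown here as an assumption rather than necessarily the slide's exact rule, compares the last two estimated level differences against the bias budget:

$$\max\Bigl\{ 2^{-\alpha}\,\bigl|\hat{Y}_{L-1}\bigr|,\ \bigl|\hat{Y}_L\bigr| \Bigr\} < \frac{(2^{\alpha} - 1)\,\varepsilon}{\sqrt{2}},$$

where $\hat{Y}_\ell$ denotes the sample mean of the level-$\ell$ difference.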

Slide 40: The Algorithm
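A minimal Python sketch of a generic MLMC loop of the kind described on the preceding slides. The function names, the pilot sample size, the geometric bias-decay assumption, and the stopping rule are assumptions rather than the authors' exact algorithm, and the target here is a scalar expectation rather than a full response surface.

```python
import numpy as np

def mlmc_estimate(sample_level_diff, eps, alpha=1.0, min_levels=2, max_levels=10, m_pilot=1000):
    """Generic MLMC loop: keep adding levels, estimating each level difference
    with m_pilot iid samples, until the pessimistic bias estimate derived from
    the last level difference falls below the bias budget eps / sqrt(2)."""
    means, variances = [], []
    level = 0
    while True:
        y = sample_level_diff(level, m_pilot)      # samples of the level-`level` difference
        means.append(y.mean())
        variances.append(y.var(ddof=1))
        # Pessimistic bias estimate, assuming the bias decays like 2**(-alpha * level).
        bias_estimate = abs(means[-1]) / (2.0 ** alpha - 1.0)
        if level + 1 >= min_levels and bias_estimate < eps / np.sqrt(2.0):
            break
        if level + 1 == max_levels:
            break
        level += 1
    return sum(means), means, variances

# Toy usage: each level halves an artificial discretization bias in estimating
# E[Z**2] = 1 for Z ~ N(0, 1); the level difference reuses the same randomness.
def toy_level_diff(level, m, rng=np.random.default_rng(0)):
    z = rng.standard_normal(m)
    fine = z ** 2 + 2.0 ** (-level)
    if level == 0:
        return fine
    coarse = z ** 2 + 2.0 ** (-(level - 1))
    return fine - coarse

estimate, means, variances = mlmc_estimate(toy_level_diff, eps=0.02)
print(f"MLMC estimate: {estimate:.4f} (true value 1.0)")
```

In this sketch the per-level sample size is held fixed for brevity; in practice one would also reallocate the per-level sample sizes, for example in proportion to the square root of the estimated variance-to-cost ratio noted earlier.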

Slide 41: Black-Scholes

Slide 42: Black-Scholes Continued

Slide 43: Conclusion

Multilevel Monte Carlo provides an efficient metamodeling scheme.
We eliminated the need for increased smoothness as the dimension increases.
We introduced a practical MLMC algorithm for stochastic simulation metamodeling.

Slide 44: Questions?