
Gaussians - PowerPoint Presentation

Uploaded by faustina-dinatale, 2017-12-25


Pieter Abbeel, UC Berkeley EECS. Many slides adapted from Thrun, Burgard, and Fox, Probabilistic Robotics.




Presentation Transcript

Slide 1

Gaussians
Pieter Abbeel, UC Berkeley EECS
Many slides adapted from Thrun, Burgard, and Fox, Probabilistic Robotics

Slide 2

Outline

Univariate Gaussian
Multivariate Gaussian
Law of Total Probability
Conditioning (Bayes' rule)

Disclaimer: lots of linear algebra in the next few lectures. See the course homepage for pointers for brushing up on your linear algebra. In fact, pretty much all computations with Gaussians will be reduced to linear algebra!

Slide 3

Univariate Gaussian

Gaussian distribution with mean $\mu$ and standard deviation $\sigma$:

$p(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$

Slide 4

Properties of Gaussians

Densities integrate to one: $\int_{-\infty}^{\infty} p(x)\, dx = 1$

Mean: $E[X] = \int_{-\infty}^{\infty} x\, p(x)\, dx = \mu$

Variance: $\operatorname{Var}(X) = \int_{-\infty}^{\infty} (x-\mu)^2\, p(x)\, dx = \sigma^2$

Slide 5
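These properties can be checked numerically. A minimal sketch assuming `numpy` and `scipy`; the values of $\mu$ and $\sigma$ below are illustrative, not from the slides:

```python
import numpy as np
from scipy import integrate

mu, sigma = 2.0, 1.5  # illustrative parameters

def gaussian_pdf(x):
    # Univariate Gaussian density with mean mu and standard deviation sigma.
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

# Density integrates to one.
total, _ = integrate.quad(gaussian_pdf, -np.inf, np.inf)

# Mean: E[X] = integral of x p(x) dx.
mean, _ = integrate.quad(lambda x: x * gaussian_pdf(x), -np.inf, np.inf)

# Variance: E[(X - mu)^2] = integral of (x - mu)^2 p(x) dx.
var, _ = integrate.quad(lambda x: (x - mu) ** 2 * gaussian_pdf(x), -np.inf, np.inf)

print(total, mean, var)  # ≈ 1.0, 2.0, 2.25
```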

Central Limit Theorem (CLT)

Classical CLT: Let $X_1, X_2, \ldots$ be an infinite sequence of independent, identically distributed random variables with $E[X_i] = \mu$ and $E[(X_i - \mu)^2] = \sigma^2$.

Define $Z_n = \frac{(X_1 + \cdots + X_n) - n\mu}{\sigma\, n^{1/2}}$.

Then, in the limit as $n$ goes to infinity, $Z_n$ is distributed according to $N(0, 1)$.

Crude statement: things that are the result of the addition of lots of small effects tend to become Gaussian.

Slide 6
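The CLT can be illustrated empirically: standardized sums of uniform random variables look standard normal. A sketch assuming `numpy`; the sample sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# X_i ~ Uniform(0, 1), so mu = 1/2 and sigma^2 = 1/12.
mu, sigma = 0.5, np.sqrt(1 / 12)
n, trials = 100, 20_000  # arbitrary sizes for illustration

x = rng.uniform(0.0, 1.0, size=(trials, n))
# Z_n = ((X_1 + ... + X_n) - n*mu) / (sigma * n^(1/2))
z = (x.sum(axis=1) - n * mu) / (sigma * np.sqrt(n))

print(z.mean(), z.std())  # ≈ 0 and 1
```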

Multivariate Gaussians

Density: $p(x) = \frac{1}{(2\pi)^{n/2}\, |\Sigma|^{1/2}} \exp\!\left(-\tfrac{1}{2}(x-\mu)^\top \Sigma^{-1} (x-\mu)\right)$

Slide 7

Multivariate Gaussians

Mean: $\mu = E[X] = \int x\, p(x)\, dx$
(integral of a vector = vector of integrals of each entry)

Covariance: $\Sigma = E[(X-\mu)(X-\mu)^\top] = \int (x-\mu)(x-\mu)^\top p(x)\, dx$
(integral of a matrix = matrix of integrals of each entry)

Slide 8
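A direct implementation of the density above can be checked against `scipy.stats.multivariate_normal`; this is a sketch, and the test point and parameters are illustrative:

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, 0.0])       # illustrative mean
Sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])  # illustrative covariance

def mvn_pdf(x, mu, Sigma):
    # p(x) = exp(-0.5 (x-mu)^T Sigma^{-1} (x-mu)) / ((2 pi)^{n/2} |Sigma|^{1/2})
    n = mu.size
    diff = x - mu
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Sigma))
    return float(np.exp(-0.5 * diff @ np.linalg.solve(Sigma, diff)) / norm)

x = np.array([0.3, -0.7])
print(mvn_pdf(x, mu, Sigma))
print(multivariate_normal(mu, Sigma).pdf(x))  # same value
```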

 = [1; 0]

= [1 0;

0 1]

= [-.5; 0]

= [1 0; 0 1]

= [-1;

-1.5

]

= [

1

0

; 0 1]

Multi-

variate

Gaussians: examplesSlide9

Multivariate Gaussians: examples

$\mu = [0;\ 0]$, $\Sigma = [1\ 0;\ 0\ 1]$

$\mu = [0;\ 0]$, $\Sigma = [.6\ 0;\ 0\ .6]$

$\mu = [0;\ 0]$, $\Sigma = [2\ 0;\ 0\ 2]$

Slide 10

 = [0; 0]

= [1 0; 0 1]

= [0; 0]

= [1 0.5; 0.5 1]

= [0; 0]

= [1 0.8; 0.8 1]

Multi-variate Gaussians: examplesSlide11

 = [0; 0]

= [1 0; 0 1]

= [0; 0]

= [1 0.5; 0.5 1]

= [0; 0]

= [1 0.8; 0.8 1]

Multi-

variate

Gaussians: examplesSlide12

Multivariate Gaussians: examples

$\mu = [0;\ 0]$, $\Sigma = [1\ -0.5;\ -0.5\ 1]$

$\mu = [0;\ 0]$, $\Sigma = [1\ -0.8;\ -0.8\ 1]$

$\mu = [0;\ 0]$, $\Sigma = [3\ 0.8;\ 0.8\ 1]$

Slide 13
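The covariances in these examples can be recovered from samples. A sketch assuming `numpy`, using one of the example covariances above:

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0])
Sigma = np.array([[1.0, -0.8],
                  [-0.8, 1.0]])  # one of the example covariances

# Draw samples; the empirical covariance should be close to Sigma.
samples = rng.multivariate_normal(mu, Sigma, size=100_000)
print(np.cov(samples.T))  # ≈ Sigma
```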

Gaussians

[Figure: the univariate Gaussian is parameterized by mean $\mu$ and standard deviation $\sigma$; the multivariate Gaussian by mean vector $\mu$ and covariance matrix $\Sigma$]

Slide 14

Partitioned Multivariate Gaussian

Consider a multivariate Gaussian and partition the random vector into $(X, Y)$:

$\begin{bmatrix} X \\ Y \end{bmatrix} \sim N\!\left( \begin{bmatrix} \mu_X \\ \mu_Y \end{bmatrix}, \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} \\ \Sigma_{YX} & \Sigma_{YY} \end{bmatrix} \right)$

Slide 15

Partitioned Multivariate Gaussian: Dual Representation

Precision matrix: $\Lambda = \Sigma^{-1}$, partitioned as $\Lambda = \begin{bmatrix} \Lambda_{XX} & \Lambda_{XY} \\ \Lambda_{YX} & \Lambda_{YY} \end{bmatrix}$  (1)

Straightforward to verify from (1) that:

$\Lambda_{XX} = (\Sigma_{XX} - \Sigma_{XY}\Sigma_{YY}^{-1}\Sigma_{YX})^{-1}$

$\Lambda_{XY} = -(\Sigma_{XX} - \Sigma_{XY}\Sigma_{YY}^{-1}\Sigma_{YX})^{-1}\,\Sigma_{XY}\Sigma_{YY}^{-1}$

And swapping the roles of $\Lambda$ and $\Sigma$:

$\Sigma_{XX} = (\Lambda_{XX} - \Lambda_{XY}\Lambda_{YY}^{-1}\Lambda_{YX})^{-1}$

$\Sigma_{XY} = -(\Lambda_{XX} - \Lambda_{XY}\Lambda_{YY}^{-1}\Lambda_{YX})^{-1}\,\Lambda_{XY}\Lambda_{YY}^{-1}$

Slide 16
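These block identities can be verified numerically on a small example. A sketch with an arbitrary positive-definite $\Sigma$, partitioned into a 2-D $X$ block and a 1-D $Y$ block:

```python
import numpy as np

# Arbitrary positive-definite covariance; X = first 2 dims, Y = last dim.
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])
Lam = np.linalg.inv(Sigma)  # precision matrix

Sxx, Sxy = Sigma[:2, :2], Sigma[:2, 2:]
Syx, Syy = Sigma[2:, :2], Sigma[2:, 2:]

# Lambda_XX = (Sigma_XX - Sigma_XY Sigma_YY^{-1} Sigma_YX)^{-1}
Lam_xx = np.linalg.inv(Sxx - Sxy @ np.linalg.inv(Syy) @ Syx)
# Lambda_XY = -Lambda_XX Sigma_XY Sigma_YY^{-1}
Lam_xy = -Lam_xx @ Sxy @ np.linalg.inv(Syy)

print(np.allclose(Lam_xx, Lam[:2, :2]), np.allclose(Lam_xy, Lam[:2, 2:]))  # True True
```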

Marginalization: $p(x) = ?$

We integrate out over $y$ to find the marginal:

$p(x) = \int p(x, y)\, dy$

Hence we have:

$X \sim N(\mu_X, \Sigma_{XX})$

Note: if we had known beforehand that $p(x)$ would be a Gaussian distribution, then we could have found the result more quickly. We would have just needed to find $E[X] = \mu_X$ and $\operatorname{Cov}(X) = \Sigma_{XX}$, which we had available through $\mu$ and $\Sigma$.

Slide 17

Marginalization Recap

If

$\begin{bmatrix} X \\ Y \end{bmatrix} \sim N\!\left( \begin{bmatrix} \mu_X \\ \mu_Y \end{bmatrix}, \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} \\ \Sigma_{YX} & \Sigma_{YY} \end{bmatrix} \right)$

Then

$X \sim N(\mu_X, \Sigma_{XX}), \qquad Y \sim N(\mu_Y, \Sigma_{YY})$

Slide 18
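The marginalization result can be checked by numerical integration: integrating the joint density over $y$ reproduces $N(\mu_X, \Sigma_{XX})$. A sketch with illustrative parameters, assuming `scipy`:

```python
import numpy as np
from scipy import integrate
from scipy.stats import multivariate_normal

mu = np.array([1.0, -1.0])      # [mu_X; mu_Y], illustrative
Sigma = np.array([[1.0, 0.6],
                  [0.6, 2.0]])  # [[Sxx, Sxy], [Syx, Syy]]
joint = multivariate_normal(mu, Sigma)

x0 = 0.5  # test point
# Numerical marginal: p(x0) = integral over y of p(x0, y) dy.
marginal_num, _ = integrate.quad(lambda y: joint.pdf([x0, y]), -np.inf, np.inf)
# Analytical claim: X ~ N(mu_X, Sigma_XX).
marginal_ana = multivariate_normal(mu[0], Sigma[0, 0]).pdf(x0)

print(marginal_num, marginal_ana)  # equal
```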

Self-quiz

Slide 19

Conditioning: $p(x \mid Y = y_0) = ?$

We have

$p(x \mid Y = y_0) = \frac{p(x, y_0)}{p(y_0)}$

Hence we have:

$X \mid Y = y_0 \sim N\!\left(\mu_X + \Sigma_{XY}\Sigma_{YY}^{-1}(y_0 - \mu_Y),\ \Sigma_{XX} - \Sigma_{XY}\Sigma_{YY}^{-1}\Sigma_{YX}\right)$

The mean moved according to the correlation and the variance on the measurement.

The covariance $\Sigma_{XX \mid Y = y_0}$ does not depend on $y_0$.

Slide 20

Conditioning Recap

If

$\begin{bmatrix} X \\ Y \end{bmatrix} \sim N\!\left( \begin{bmatrix} \mu_X \\ \mu_Y \end{bmatrix}, \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} \\ \Sigma_{YX} & \Sigma_{YY} \end{bmatrix} \right)$

Then

$X \mid Y = y_0 \sim N\!\left(\mu_X + \Sigma_{XY}\Sigma_{YY}^{-1}(y_0 - \mu_Y),\ \Sigma_{XX} - \Sigma_{XY}\Sigma_{YY}^{-1}\Sigma_{YX}\right)$
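The conditioning formula can be checked against the definition $p(x \mid y_0) = p(x, y_0) / p(y_0)$. A sketch with illustrative 1-D blocks, assuming `scipy`:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative partition with 1-D X and 1-D Y.
mu_x, mu_y = np.array([0.0]), np.array([1.0])
Sxx = np.array([[1.0]])
Sxy = np.array([[0.8]])
Syy = np.array([[2.0]])
y0 = np.array([2.0])

# Conditional mean and covariance from the formulas above.
cond_mu = mu_x + Sxy @ np.linalg.solve(Syy, y0 - mu_y)
cond_cov = Sxx - Sxy @ np.linalg.solve(Syy, Sxy.T)

# Check against p(x | y0) = p(x, y0) / p(y0) at a test point.
x = 0.3
mu = np.concatenate([mu_x, mu_y])
Sigma = np.block([[Sxx, Sxy], [Sxy.T, Syy]])
lhs = multivariate_normal(cond_mu, cond_cov).pdf(x)
rhs = multivariate_normal(mu, Sigma).pdf([x, y0[0]]) / multivariate_normal(mu_y, Syy).pdf(y0)

print(np.isclose(lhs, rhs))  # True
```

With these numbers the conditional is $N(0.4, 0.68)$: the mean moves toward the observed $y_0 = 2$ through the cross-covariance, and the variance shrinks from 1 to 0.68.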