
Gaussian Process Regression for Dummies

Greg Cox, Richard Shiffrin

Presentation transcript:

Slide 1: Gaussian Process Regression for Dummies

Greg Cox
Richard Shiffrin

Slide 2: Continuous response measures

Slide 3: The problem

What do we do if we do not know the functional form?

Rasmussen & Williams, Gaussian Processes for Machine Learning. http://www.gaussianprocesses.org/

Slide 4: Linear regression

Slide 5: Bayesian linear regression

Slide 6: Gaussian processes

A Gaussian process is a collection of random variables, any finite subset of which is jointly normally distributed.

Normal regression: assume functional form → mean and covariance among data
Gaussian process regression: assume form of mean and covariance among data → functional form

Slide 7: Covariance kernel

How much does knowledge of one point tell us about another point?
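The deck itself contains no code, but the kernel question can be sketched in Python (a hypothetical illustration; the squared-exponential kernel and its parameters are assumptions, not from the slides). Under an RBF kernel, knowing one point is highly informative about nearby points and nearly uninformative about distant ones:

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance kernel k(x, x')
    sq_dist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dist / length_scale ** 2)

x = np.array([0.0, 0.1, 3.0])
K = rbf_kernel(x, x)

# Nearby inputs (0.0 vs 0.1) covary strongly; distant ones (0.0 vs 3.0) barely at all.
print(round(K[0, 1], 3))  # 0.995
print(round(K[0, 2], 3))  # 0.011
```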

Slide 8: Returning to linear regression

Mean = function of parameters
Covariance = uncertainty about parameters + observation noise
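Under illustrative assumptions not given on the slide (a linear model f = Xw with Gaussian prior w ~ N(0, Σp) and noise variance σ²), this decomposition can be written down directly in data space: the covariance among observations is X Σp Xᵀ (parameter uncertainty) plus σ²I (observation noise).

```python
import numpy as np

# Hypothetical setup: linear model f(x) = w0 + w1 * x with prior w ~ N(0, I)
x = np.linspace(0.0, 1.0, 5)
X = np.column_stack([np.ones_like(x), x])  # design matrix with intercept column

prior_cov = np.eye(2)   # uncertainty about the parameters (w0, w1)
noise_var = 0.1         # observation noise sigma^2

# Data-space view of Bayesian linear regression:
# mean comes from the prior mean of the parameters (zero here);
# covariance = parameter uncertainty pushed through X, plus observation noise.
mean = X @ np.zeros(2)
cov = X @ prior_cov @ X.T + noise_var * np.eye(len(x))

print(cov[0, 0])  # 1.1  (= 1*1 + 0*0 + 0.1)
print(cov[4, 4])  # 2.1  (= 1*1 + 1*1 + 0.1)
```

Note that X Σp Xᵀ is itself a covariance kernel (the linear kernel), which is what lets linear regression be treated as a special case of a Gaussian process.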

Slide 9: Takeaways from linear regression

Rather than work in "parameter space", we can bypass it by working directly in "data space".

This allows us to worry only about how different data points relate to one another, without needing to specify the parameters of the data generating process.

The posterior predictive distribution encapsulates our uncertainty about the data generating process.

The choice of covariance kernel, which says how different observations inform one another, implies certain properties of the data generating process.
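As one concrete illustration of that last point (the kernels here are standard choices, not taken from the deck): a periodic kernel encodes the assumption that observations exactly one period apart are perfectly informative about each other, while an RBF kernel encodes only smoothness, so distant points are nearly independent.

```python
import numpy as np

def rbf(x1, x2, length_scale=1.0):
    # Smoothness assumption: covariance decays with distance
    return np.exp(-0.5 * (x1 - x2) ** 2 / length_scale ** 2)

def periodic(x1, x2, period=2.0, length_scale=1.0):
    # Periodicity assumption (exp-sine-squared kernel):
    # points a whole period apart covary perfectly
    return np.exp(-2.0 * np.sin(np.pi * (x1 - x2) / period) ** 2 / length_scale ** 2)

print(round(periodic(0.0, 2.0), 3))  # 1.0  (one full period apart)
print(round(rbf(0.0, 2.0), 3))       # 0.135 (same distance, nearly independent)
```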

Slide 10: Posterior predictive distribution

So far, we have computed the posterior predictive via the parameters (e.g., b) of the data generating process. But a Gaussian process may have an infinite number of parameters (θ). How can we compute the posterior predictive in this case?

The covariance kernel to the rescue! Let's say we don't know the data generating process, but we assume all observations are drawn from the same Gaussian process (i.e., are jointly multivariate normal) and we have an idea of how observations mutually inform one another: the covariance kernel k(x, x'). Then we want the distribution of new data values f*(x*), given observed data f(x). But these are all multivariate normal!
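Because everything is jointly multivariate normal, the posterior predictive follows from the standard Gaussian conditioning formulas: f* | f ~ N(K*ᵀ(K + σ²I)⁻¹ f, K** − K*ᵀ(K + σ²I)⁻¹ K*). A minimal sketch (the kernel, data, and noise level are made-up illustrations, not from the slides):

```python
import numpy as np

def rbf(xa, xb, length_scale=1.0):
    return np.exp(-0.5 * (xa[:, None] - xb[None, :]) ** 2 / length_scale ** 2)

# Made-up observed data f(x) at three inputs
x = np.array([-2.0, 0.0, 2.0])
f = np.array([-1.0, 0.5, 1.0])
x_star = np.array([0.0, 1.0])   # inputs where we want predictions f*(x*)
noise_var = 1e-4

K = rbf(x, x) + noise_var * np.eye(len(x))  # cov among observed points (+ noise)
K_star = rbf(x, x_star)                     # cov between observed and new points
K_ss = rbf(x_star, x_star)                  # cov among new points

# Gaussian conditioning: f* | f is again multivariate normal
mean_star = K_star.T @ np.linalg.solve(K, f)
cov_star = K_ss - K_star.T @ np.linalg.solve(K, K_star)
```

At x* = 0.0, which coincides with an observed input, the posterior mean reproduces the observation (up to the small noise) and the posterior variance collapses toward zero; at x* = 1.0 the prediction interpolates with nonzero remaining uncertainty.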

Slide 11: Building a function

Slide 12: A hierarchical Bayesian approach

Slide 13: Spivey, Grosjean, & Knoblich (2005)

Slide 14: The GP model

Slide 15: Model structure

Slide 16: The GP model

Slide 17: Results

Slide 18: Results

Inflection points can indicate important changes in cognitive processing.

Slide 19: Summary

Gaussian process models offer a useful and extensible way of dealing with behavioral trajectories:

Able to model the entire spectrum of dynamics

Can be embedded in a generative model to infer attractors and inflection points

Allow for deeper inferences about underlying cognitive processes