Greg Cox, Richard Shiffrin
Gaussian Process Regression for Dummies
Continuous response measures
What do we do if we do not know the functional form?
Rasmussen & Williams, Gaussian Processes for Machine Learning (http://www.gaussianprocesses.org/)
Bayesian linear regression
A Gaussian process is a collection of random variables, any finite subset of which is jointly normally distributed. It is fully specified by its mean and covariance functions.
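This definition can be sketched numerically. The snippet below, which is an illustrative sketch rather than anything from the slides, builds a covariance matrix from an assumed squared-exponential (RBF) kernel and draws sample functions from the resulting zero-mean multivariate normal; the length scale and variance values are arbitrary choices.

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D points."""
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / length_scale ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 50)
# Small jitter on the diagonal keeps the matrix numerically positive definite
K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))
# Any finite set of function values is jointly normal: draw 3 sample functions
samples = rng.multivariate_normal(np.zeros(len(x)), K, size=3)
```

Each row of `samples` is one smooth random function evaluated at the 50 input points; the smoothness comes entirely from the kernel, not from any parametric form.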
Gaussian process regression: assume a form for the mean and covariance among the data, rather than a functional form.
How much does knowledge of one point tell us about another point?
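As an illustration (not from the slides): under an assumed RBF kernel, the answer depends only on the distance between the two points, decaying smoothly with an assumed length scale. Nearby points are highly informative about each other; distant points are nearly independent.

```python
import numpy as np

def rbf_corr(d, length_scale=1.0):
    """Correlation implied by an RBF kernel between points a distance d apart."""
    return np.exp(-0.5 * d ** 2 / length_scale ** 2)

for d in [0.0, 0.5, 1.0, 2.0, 4.0]:
    print(f"distance {d:.1f}: implied correlation {rbf_corr(d):.3f}")
```

Changing the length scale rescales this decay, which is one concrete sense in which the kernel choice encodes assumptions about the data generating process.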
Returning to linear regression
Mean = Function of parameters
Covariance = Uncertainty about parameters + Observation noise
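This decomposition can be written out directly. The sketch below is a minimal Bayesian linear regression with an assumed Gaussian prior on the weights (precision `alpha`) and assumed noise variance `sigma2`; the predictive mean is a function of the posterior weights, and the predictive covariance is exactly the sum of a parameter-uncertainty term and an observation-noise term.

```python
import numpy as np

def blr_predictive(Phi_train, y_train, Phi_test, alpha=1.0, sigma2=0.1):
    """Posterior predictive for Bayesian linear regression,
    with prior w ~ N(0, alpha^-1 I) and noise variance sigma2."""
    d = Phi_train.shape[1]
    A = Phi_train.T @ Phi_train / sigma2 + alpha * np.eye(d)  # posterior precision
    A_inv = np.linalg.inv(A)
    w_mean = A_inv @ Phi_train.T @ y_train / sigma2           # posterior mean of weights
    mean = Phi_test @ w_mean                                  # function of parameters
    # parameter uncertainty + observation noise
    cov = Phi_test @ A_inv @ Phi_test.T + sigma2 * np.eye(len(Phi_test))
    return mean, cov

# Hypothetical example: noiseless data from y = 1 + 2x, with a bias feature
x = np.linspace(0.0, 1.0, 20)
Phi = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x
x_new = np.array([0.5])
Phi_new = np.column_stack([np.ones_like(x_new), x_new])
mean, cov = blr_predictive(Phi, y, Phi_new)
```

Note that `cov` never drops below `sigma2`: even with perfect knowledge of the parameters, observation noise remains.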
Takeaways from linear regression
Rather than work in “parameter space”, we can bypass it by just working in “data space”
This allows us to worry only about how different data points relate to one another without needing to specify the parameters of the data generating process
The posterior predictive distribution encapsulates our uncertainty about the data generating process
The choice of covariance kernel—which says how different observations inform one another—implies certain properties of the data generating process
Posterior predictive distribution
So far, we have computed the posterior predictive by marginalizing over the parameters of the data generating process. But a Gaussian process may have an infinite number of parameters. How can we compute the posterior predictive in this case?
The multivariate normal to the rescue! Let's say we don't know the data generating process, but we assume all observations are drawn from the same Gaussian process (i.e., are jointly multivariate normal) and we have an idea about how observations can mutually inform one another: the covariance kernel.
New data values f(x*), given observed data f(x): but these are all multivariate normal!
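Because observed and new values are jointly multivariate normal, the posterior predictive follows from standard Gaussian conditioning. A minimal sketch, assuming an RBF kernel, an arbitrary length scale, and a small noise term (none of these choices come from the slides):

```python
import numpy as np

def rbf(x1, x2, ell=1.0):
    """Assumed squared-exponential kernel with length scale ell."""
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ell ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4, ell=1.0):
    """Posterior predictive mean and covariance via Gaussian conditioning."""
    K = rbf(x_train, x_train, ell) + noise * np.eye(len(x_train))
    K_s = rbf(x_test, x_train, ell)   # cross-covariance K(x*, x)
    K_ss = rbf(x_test, x_test, ell)   # test covariance K(x*, x*)
    K_inv = np.linalg.inv(K)
    mean = K_s @ K_inv @ y_train              # conditional mean
    cov = K_ss - K_s @ K_inv @ K_s.T          # conditional covariance
    return mean, cov

x_train = np.array([-2.0, 0.0, 2.0])
y_train = np.sin(x_train)
x_test = np.linspace(-3.0, 3.0, 7)
mean, cov = gp_posterior(x_train, y_train, x_test)
```

Near the training points the predictive mean interpolates the data and the variance shrinks toward the noise level; far from the data, the prediction reverts to the prior mean (zero) with full prior variance.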
Building a function
A hierarchical Bayesian approach
The GP model
Attractors and inflection points can indicate important changes in cognitive processing
Gaussian process models offer a useful and extensible way of dealing with behavioral trajectories
Able to model the entire spectrum of dynamics
Can be embedded in a generative model to infer attractors and inflection points
Allow for deeper inferences about underlying cognitive processes