Article overview
MCMC for Variationally Sparse Gaussian Processes

Authors: James Hensman; Alexander G. de G. Matthews; Maurizio Filippone; Zoubin Ghahramani

Date: 12 Jun 2015

Abstract: Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has been made into attacking three issues with GP models: how to compute efficiently when the number of data points is large; how to approximate the posterior when the likelihood is not Gaussian; and how to estimate posteriors over covariance function parameters. This paper addresses all three simultaneously, using a variational approximation to the posterior that is sparse in the support of the function but otherwise free-form. The result is a Hybrid Monte Carlo sampling scheme that allows a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs. Code to replicate each experiment in this paper will be available shortly.

Source: arXiv, 1506.04000
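The scheme described in the abstract, sampling the inducing values of a sparse GP with Hybrid Monte Carlo under a non-Gaussian likelihood, might be caricatured as follows. This is a minimal sketch, not the paper's implementation: the RBF kernel, the Bernoulli (logistic) likelihood, the finite-difference gradient, and all names (`log_joint`, `hmc_step`, the inducing inputs `Z`) are illustrative assumptions, and hyperparameters are held fixed rather than sampled jointly as in the paper.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between two sets of inputs.
    d = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d / lengthscale ** 2)

def log_joint(u, X, y, Z, lengthscale=1.0):
    """Unnormalised log target over inducing values u:
    log p(y | f = E[f | u]) + log N(u | 0, Kmm).
    The likelihood is Bernoulli/logistic, i.e. non-Gaussian,
    so there is no closed-form posterior and sampling is needed."""
    m = len(Z)
    Kmm = rbf(Z, Z, lengthscale) + 1e-6 * np.eye(m)
    Knm = rbf(X, Z, lengthscale)
    f = Knm @ np.linalg.solve(Kmm, u)       # sparse conditional mean of f given u
    log_lik = np.sum(y * f - np.log1p(np.exp(f)))
    L = np.linalg.cholesky(Kmm)
    a = np.linalg.solve(L, u)
    log_prior = -0.5 * a @ a - np.log(np.diag(L)).sum()
    return log_lik + log_prior

def grad(fun, x, eps=1e-5):
    # Central finite differences stand in for the analytic gradient.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (fun(x + e) - fun(x - e)) / (2 * eps)
    return g

def hmc_step(u, logp, step=0.05, n_leapfrog=10, rng=None):
    """One Hybrid Monte Carlo transition: leapfrog integration of
    Hamiltonian dynamics, then a Metropolis accept/reject."""
    rng = rng or np.random.default_rng(0)
    p = rng.standard_normal(len(u))
    u_new, p_new = u.copy(), p.copy()
    p_new += 0.5 * step * grad(logp, u_new)       # initial half step in momentum
    for i in range(n_leapfrog):
        u_new += step * p_new                     # full step in position
        if i < n_leapfrog - 1:
            p_new += step * grad(logp, u_new)     # full step in momentum
    p_new += 0.5 * step * grad(logp, u_new)       # final half step in momentum
    log_accept = logp(u_new) - logp(u) - 0.5 * (p_new @ p_new - p @ p)
    if np.log(rng.uniform()) < log_accept:
        return u_new, True
    return u, False

# Toy usage: binary labels on 1-D inputs, five inducing points.
rng = np.random.default_rng(1)
X = rng.standard_normal((20, 1))
y = (X[:, 0] > 0).astype(float)
Z = np.linspace(-2.0, 2.0, 5)[:, None]
target = lambda u: log_joint(u, X, y, Z)
u, accepted = hmc_step(np.zeros(5), target, rng=np.random.default_rng(2))
```

Because computations touch only the m inducing points (Cholesky of the m-by-m matrix Kmm), each step costs O(n·m + m^3) rather than the O(n^3) of a full GP, which is the efficiency the abstract refers to.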