The variational Gaussian process (VGP) is a Bayesian nonparametric variational model that adapts its shape to match complex posterior distributions. The VGP generates approximate posterior samples by drawing latent inputs and warping them through random non-linear mappings; the distribution over random mappings is learned during inference, enabling the transformed outputs to adapt to varying complexity.
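The sampling mechanism described above can be sketched in a few lines of NumPy. This is a minimal illustration under assumptions (an RBF kernel, a zero-mean GP, standard-normal latent inputs, and made-up function names); the paper's full construction additionally conditions the GP on learned variational data and uses the warped outputs as parameters of a mean-field distribution.

import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel matrix between input sets X and Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def vgp_style_sample(n_points=5, latent_dim=2, seed=0):
    rng = np.random.default_rng(seed)
    # 1. Generate latent inputs from a standard normal.
    xi = rng.standard_normal((n_points, latent_dim))
    # 2. Draw a random non-linear mapping f ~ GP(0, k) evaluated at xi,
    #    i.e. sample from the multivariate normal N(0, K(xi, xi)).
    K = rbf_kernel(xi, xi) + 1e-6 * np.eye(n_points)  # jitter for stability
    f_at_xi = np.linalg.cholesky(K) @ rng.standard_normal(n_points)
    # 3. The warped outputs f(xi) act here as the approximate posterior draws
    #    (in the full VGP they parameterize a mean-field distribution).
    return f_at_xi

print(vgp_style_sample())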
Variational Gaussian Process
Dustin Tran, Rajesh Ranganath, David M. Blei (ICLR 2016)
Presented by Tran Quoc Hoan (@k09hthaduonght.wordpress.com/)
10 February 2016, Paper Alert, Hasegawa lab., The University of Tokyo
Summary
• Deep generative models provide complex representations of data
• Variational inference methods require a rich family of approximating distributions
• They develop a powerful variational model - the variational Gaussian process (VGP)
• They prove a universal approximation theorem: the VGP can capture any continuous posterior distribution
• They derive an efficient black box inference algorithm
Variational Models
• We want to compute the posterior p(z | x) (z: latent variables, x: data)
• Variational inference seeks to minimize KL(q(z; λ) || p(z | x)) over a family of distributions q(z; λ)
• Equivalently, maximize the evidence lower bound (ELBO):
  log p(x) ≥ E_{q(z; λ)}[log p(x | z)] − KL(q(z; λ) || p(z))
• (Common) Mean-field distribution: q(z; λ) = ∏_i q(z_i; λ_i)
  (a worked sketch of this setup follows the reference below)
• (Newer) Hierarchical variational models: interpret the family as a variational model for the posterior latent variables z, introducing new latent variables [1]
[1] Lawrence, N. (2000). Variational Inference in Probabilistic Models. PhD thesis.
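To make the ELBO and the mean-field family concrete, here is a minimal NumPy sketch on a toy model chosen purely for illustration (a scalar Gaussian mean with a standard-normal prior and unit-variance likelihood); the model, all names, and the crude grid search are assumptions, not the paper's algorithm, which instead follows stochastic gradients of the same objective.

import numpy as np

rng = np.random.default_rng(0)

# Toy model (an assumption for illustration): p(z) = N(0, 1) and
# p(x_i | z) = N(z, 1) for each observation, with a scalar latent z.
x = rng.normal(loc=2.0, scale=1.0, size=20)

def log_normal(v, mean, std):
    return -0.5 * np.log(2 * np.pi * std**2) - 0.5 * ((v - mean) / std) ** 2

def elbo(mu, log_std, n_mc=1000):
    # Monte Carlo estimate of E_q[log p(x | z)] - KL(q(z; λ) || p(z))
    # under the mean-field Gaussian q(z; λ) = N(mu, std^2).
    std = np.exp(log_std)
    z = mu + std * rng.standard_normal(n_mc)      # reparameterized samples
    log_lik = log_normal(x[None, :], z[:, None], 1.0).sum(axis=1)
    log_prior = log_normal(z, 0.0, 1.0)
    log_q = log_normal(z, mu, std)
    return (log_lik + log_prior - log_q).mean()   # log p(x) >= this value

# Crude grid search over λ = (mu, log_std); black box variational
# inference would follow stochastic gradients of the ELBO instead.
best = max((elbo(mu, ls), mu, ls)
           for mu in np.linspace(0, 3, 31)
           for ls in np.linspace(-2, 0, 11))
print("best ELBO %.2f at mu=%.2f, std=%.2f" % (best[0], best[1], np.exp(best[2])))

Because the toy model is conjugate, the search should settle near the exact posterior (mean about 1.9, std about 0.22 for 20 observations near 2.0), illustrating that maximizing the ELBO drives q toward the true posterior whenever the family contains it.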