Metric Gaussian Variational Inference

Bayes Forum - Talk

  • Date: 05.07.2019
  • Time: 14:00 - 15:00
  • Speaker: Jakob Knollmueller (MPA)
  • Location: MPA
  • Room: Old Lecture Hall 401
  • Host: MPA
A variational Gaussian approximation of the posterior distribution can be an excellent way to infer posterior quantities. However, capturing all posterior correlations requires parametrizing the full covariance, which scales quadratically with the problem size. This scaling prohibits full-covariance approximations for large-scale problems. As a solution to this limitation we propose Metric Gaussian Variational Inference (MGVI). This procedure approximates the variational covariance such that it requires no parameters of its own and still provides reliable posterior correlations and uncertainties for all model parameters. We approximate the variational covariance with the inverse Fisher metric, a local estimate of the true posterior uncertainty. This covariance is only stored implicitly, and all necessary quantities can be extracted from it via independent samples drawn from the approximating Gaussian. MGVI requires minimizing a stochastic estimate of the Kullback-Leibler divergence only with respect to the mean of the variational Gaussian, a quantity that scales linearly with the problem size. We motivate the choice of this covariance from an information-geometric perspective. We validate the method against established approaches, demonstrate its scalability into the regime of over a million parameters, and show its capability to capture posterior distributions over complex models with multiple components and strongly non-Gaussian prior distributions. (arXiv:1901.11033)
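The iteration described in the abstract (fix the covariance to the inverse Fisher metric, draw samples from the resulting Gaussian, and minimize the sampled KL only with respect to the mean) can be illustrated on a toy problem. The following is a minimal sketch, not the authors' implementation: it assumes a hypothetical linear Gaussian model d = R s + n with a standard-Gaussian prior on s, where the Fisher metric is available in closed form and, for clarity, is built explicitly rather than applied implicitly as in MGVI proper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical toy model: data d = R s + n, standard-Gaussian prior on s,
# Gaussian noise with variance sigma2. All names here are illustrative.
n_data, n_par = 50, 20
sigma2 = 0.5
R = rng.normal(size=(n_data, n_par))
s_true = rng.normal(size=n_par)
d = R @ s_true + rng.normal(scale=np.sqrt(sigma2), size=n_data)

def neg_log_post(s):
    # Negative log posterior up to an additive constant.
    r = d - R @ s
    return 0.5 * (r @ r) / sigma2 + 0.5 * (s @ s)

def neg_log_post_grad(s):
    return -R.T @ (d - R @ s) / sigma2 + s

# Fisher metric of this model: likelihood term plus the prior identity.
# (In MGVI the metric is never assembled; samples with covariance M^{-1}
# are generated implicitly. Here we build it for transparency.)
M = R.T @ R / sigma2 + np.eye(n_par)
cov_chol = np.linalg.cholesky(np.linalg.inv(M))  # to sample N(0, M^{-1})

mean = np.zeros(n_par)
for _ in range(4):  # alternate: resample residuals, re-optimize the mean
    samples = [cov_chol @ rng.normal(size=n_par) for _ in range(10)]
    # Stochastic KL estimate: average negative log posterior over samples
    # shifted by the current mean; optimized w.r.t. the mean only.
    kl = lambda m: np.mean([neg_log_post(m + xi) for xi in samples])
    kl_grad = lambda m: np.mean(
        [neg_log_post_grad(m + xi) for xi in samples], axis=0)
    mean = minimize(kl, mean, jac=kl_grad, method="L-BFGS-B").x
```

For this linear Gaussian toy model the exact posterior mean is `np.linalg.solve(M, R.T @ d / sigma2)`, so the quality of the stochastic approximation can be checked directly; the residual is set by the finite number of samples per iteration.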
