Evidence lower bound

In statistics, the evidence lower bound (ELBO, also variational lower bound or negative variational free energy) is the quantity optimized in variational Bayesian methods. These methods handle cases where a distribution $q(z)$ over unobserved variables $z$ is optimized as an approximation to the true posterior $p(z \mid x)$, given observed data $x$. The evidence lower bound is then defined as[1]

$$\operatorname{ELBO}(q) = \mathbb{E}_{z \sim q}\bigl[\log p(x, z)\bigr] - \mathbb{E}_{z \sim q}\bigl[\log q(z)\bigr] = H(q) - H\bigl(q, p(x, \cdot)\bigr),$$

where $H(q, p) = -\mathbb{E}_{z \sim q}[\log p(z)]$ is the cross entropy and $H(q)$ is the entropy of $q$. Maximizing the evidence lower bound minimizes $D_{\mathrm{KL}}\bigl(q(z) \parallel p(z \mid x)\bigr)$, the Kullback–Leibler divergence, a measure of the dissimilarity of $q$ from the true posterior; the two are linked by the identity $\log p(x) = \operatorname{ELBO}(q) + D_{\mathrm{KL}}\bigl(q(z) \parallel p(z \mid x)\bigr)$. The primary reason this quantity is preferred for optimization is that, for a suitable choice of $q$, it can be computed without access to the posterior.
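Why this quantity is a lower bound on the log evidence $\log p(x)$ can be seen with a standard one-line derivation via Jensen's inequality (the logarithm is concave), sketched here in the notation above:

```latex
\begin{align*}
\log p(x) &= \log \int p(x, z)\,\mathrm{d}z
           = \log \int q(z)\,\frac{p(x, z)}{q(z)}\,\mathrm{d}z \\
          &\geq \int q(z)\,\log \frac{p(x, z)}{q(z)}\,\mathrm{d}z
           = \mathbb{E}_{z \sim q}\bigl[\log p(x, z)\bigr]
             - \mathbb{E}_{z \sim q}\bigl[\log q(z)\bigr]
           = \operatorname{ELBO}(q).
\end{align*}
```

The inequality is tight exactly when $q(z) = p(z \mid x)$, in which case the ELBO equals $\log p(x)$.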

For other measures of dissimilarity that can be optimized to fit $q$, see Divergence (statistics).[2]
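The decomposition $\log p(x) = \operatorname{ELBO}(q) + D_{\mathrm{KL}}(q \parallel p(z \mid x))$ can be checked numerically. The sketch below uses a hypothetical two-state latent variable with illustrative joint probabilities (the numbers are invented for this example, not taken from the text):

```python
import math

# Hypothetical joint p(x, z) for one fixed observation x and z in {0, 1}.
p_joint = [0.2, 0.1]                              # p(x, z=0), p(x, z=1)
log_evidence = math.log(sum(p_joint))             # log p(x)
posterior = [p / sum(p_joint) for p in p_joint]   # p(z | x)

def elbo(q):
    """ELBO(q) = E_q[log p(x, z)] - E_q[log q(z)]."""
    return sum(qi * (math.log(pj) - math.log(qi))
               for qi, pj in zip(q, p_joint))

def kl(q, p):
    """Kullback-Leibler divergence D_KL(q || p)."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p))

q = [0.5, 0.5]  # an arbitrary approximate posterior

# log p(x) = ELBO(q) + D_KL(q || p(z|x)) holds for any q ...
assert abs(log_evidence - (elbo(q) + kl(q, posterior))) < 1e-12
# ... so the ELBO never exceeds the log evidence,
assert elbo(q) <= log_evidence
# and the bound is tight when q is the true posterior.
assert abs(elbo(posterior) - log_evidence) < 1e-12
```

Because the KL term is nonnegative, raising the ELBO with respect to $q$ necessarily shrinks the gap to the fixed quantity $\log p(x)$, which is why maximizing the bound and minimizing the divergence are the same optimization.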

References

  1. Yang, Xitong. "Understanding the Variational Lower Bound" (PDF). Institute for Advanced Computer Studies, University of Maryland. Retrieved 20 March 2018.
  2. Minka, Thomas (2005). "Divergence Measures and Message Passing" (PDF).
  3. Bishop, Christopher M. (2006). "10.1 Variational Inference". Pattern Recognition and Machine Learning (PDF).
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.