Q9 – Variational Lower Bound

Contributed by Matthew Zak.

  1. Using Jensen's inequality for the concave \log function, \log(\sum_i\alpha_ix_i)\geq\sum_i\alpha_i\log(x_i) (with \alpha_i\geq 0 and \sum_i\alpha_i=1), show that \log(p(x))\geq\sum_hq(h\mid x)\log(p(x,h))-\sum_hq(h\mid x)\log(q(h\mid x)), where p(x,h) is the joint distribution over x and a latent variable h (so p(x)=\sum_hp(x,h)) and q(h\mid x) is an approximation of the true posterior p(h\mid x). (A derivation sketch follows the list.)
  2. Show that if q(h\mid x)=p(h\mid x) and p(x,h)=p(h\mid x)\,p(x), then the variational lower bound from part 1 equals the log marginal likelihood \log(p(x)). (A verification sketch follows the list.)
  3. Knowing that the gap between \log(p(x)) and the variational lower bound is the KL divergence KL(q\,\|\,p)=\sum_h q(h\mid x)\log\left(\frac{q(h\mid x)}{p(h\mid x)}\right), show that the KL divergence tends to 0 as q(h\mid x) approaches p(h\mid x). (A sketch follows the list.)
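
For part 1, here is one way the Jensen step can be arranged (a sketch, not an official solution; the only assumption is that q(h\mid x)>0 wherever p(x,h)>0, so the ratio below is defined):

\log(p(x)) = \log\left(\sum_h p(x,h)\right) = \log\left(\sum_h q(h\mid x)\frac{p(x,h)}{q(h\mid x)}\right)
\geq \sum_h q(h\mid x)\log\left(\frac{p(x,h)}{q(h\mid x)}\right) \quad \text{(Jensen, with } \alpha_h=q(h\mid x),\ x_h=p(x,h)/q(h\mid x)\text{)}
= \sum_h q(h\mid x)\log(p(x,h)) - \sum_h q(h\mid x)\log(q(h\mid x)).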
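
For part 2, substituting q(h\mid x)=p(h\mid x) and p(x,h)=p(h\mid x)\,p(x) into the bound gives (again a sketch):

\sum_h p(h\mid x)\log(p(h\mid x)\,p(x)) - \sum_h p(h\mid x)\log(p(h\mid x))
= \sum_h p(h\mid x)\log(p(h\mid x)) + \sum_h p(h\mid x)\log(p(x)) - \sum_h p(h\mid x)\log(p(h\mid x))
= \log(p(x))\sum_h p(h\mid x) = \log(p(x)),

using \sum_h p(h\mid x)=1.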
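
For part 3, one way to see the limit (assuming a finite set of latent values h and p(h\mid x)>0): if q(h\mid x)\to p(h\mid x) pointwise, then each ratio q(h\mid x)/p(h\mid x)\to 1, so each summand q(h\mid x)\log\left(\frac{q(h\mid x)}{p(h\mid x)}\right)\to p(h\mid x)\log(1)=0, and the finite sum KL(q\,\|\,p)\to 0. Together with KL(q\,\|\,p)\geq 0 (Gibbs' inequality), this shows the bound becomes tight exactly when the approximation matches the true posterior.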