Review

From GAN to WGAN

1. KL (Kullback-Leibler) Divergence

KL divergence measures how one probability distribution $p$ diverges from a second, reference probability distribution $q$:

$$ D_{KL}(p||q) = \int_x p(x)\log \frac{p(x)}{q(x)}dx $$
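As a minimal numerical sketch (not from the original notes), the discrete analogue of this integral can be computed directly; the function name `kl_divergence` and the `eps` smoothing term are illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D_KL(p || q) in nats, with eps to avoid log(0)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

p = np.array([0.4, 0.6])
q = np.array([0.5, 0.5])
print(kl_divergence(p, q))  # >= 0, and 0 only when p == q
print(kl_divergence(q, p))  # generally differs: KL is asymmetric
```

The asymmetry shown in the last two lines is one motivation for the symmetric JS divergence below.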

2. Jensen-Shannon Divergence

JS divergence measures the similarity between two probability distributions; it is symmetric and bounded by $[0,1]$ (when the logarithm is taken with base 2).

$$ D_{JS}(p||q) = \frac{1}{2}D_{KL}(p||\frac{p+q}{2})+\frac{1}{2}D_{KL}(q||\frac{p+q}{2}) $$
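A short sketch of the same formula in code, reusing the hypothetical `kl_divergence` helper from above; in natural log units the bound is $\log 2$ rather than 1.

```python
def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL of p and q to the mixture m = (p + q) / 2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

print(js_divergence(p, q))                      # equals js_divergence(q, p): symmetric
print(js_divergence(p, q) <= np.log(2) + 1e-9)  # bounded by log 2 in nats (1 in bits)
```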
