KL (Kullback–Leibler) divergence measures how one probability distribution $p$ diverges from a second, expected probability distribution $q$:
$$ D_{KL}(p||q) = \int_x p(x)\log \frac{p(x)}{q(x)}dx $$
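As a quick illustration, here is a minimal sketch of the discrete analogue (a sum over a finite support in place of the integral). The function name `kl_divergence` and the array interface are assumptions for this example, not part of any particular library:

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete D_KL(p || q) = sum_x p(x) * log(p(x) / q(x)).

    p and q are probability vectors over the same finite support;
    q must be positive wherever p is, or the divergence is infinite.
    Uses the natural log, so the result is in nats.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))
```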
JS (Jensen–Shannon) divergence measures the similarity between two probability distributions. Unlike KL divergence, it is symmetric, and it is bounded by $[0, 1]$ when the logarithm is taken to base 2 (by $[0, \ln 2]$ with the natural log):
$$ D_{JS}(p||q) = \frac{1}{2}D_{KL}(p||\frac{p+q}{2})+\frac{1}{2}D_{KL}(q||\frac{p+q}{2}) $$
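Following the definition above, JS divergence is just two KL terms measured against the mixture $m = \frac{p+q}{2}$. This sketch reuses the hypothetical `kl_divergence` helper from the previous example and divides by $\ln 2$ to convert nats to bits, so the $[0, 1]$ bound applies:

```python
def js_divergence(p, q):
    """D_JS(p || q) = 0.5 * D_KL(p || m) + 0.5 * D_KL(q || m), m = (p + q) / 2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # mixture is nonzero wherever p or q is, so KL stays finite
    d_nats = 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
    return d_nats / np.log(2.0)  # nats -> bits, giving a value in [0, 1]
```

For example, with `p = [0.1, 0.4, 0.5]` and `q = [0.8, 0.15, 0.05]`, `kl_divergence(p, q)` and `kl_divergence(q, p)` return different values, while `js_divergence(p, q)` equals `js_divergence(q, p)` and stays within $[0, 1]$.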
