The Jensen-Shannon divergence is a symmetrized version of the Kullback-Leibler divergence: $\mathrm{JSD}(Q, R) = \tfrac{1}{2}\mathrm{KL}(Q \,\|\, M) + \tfrac{1}{2}\mathrm{KL}(R \,\|\, M)$, where $M = \tfrac{1}{2}(Q + R)$ is the equal mixture of the two distributions. Unlike the KL divergence, the square root of the JS divergence is a true metric obeying the triangle inequality. Interestingly, if $X$ is a random variable drawn from the mixture $M$ of two distributions $Q$ and $R$, then the JS divergence between $Q$ and $R$ is equal to the mutual information between $X$ and the indicator variable $I$ marking which component generated $X$, so that $P(I=1 \mid X) = q(X)/( q(X) + r(X) )$, where $q(x)$ is the density of $Q$ and $r(x)$ is the density of $R$. (One immediate consequence is that the JS divergence is at most 1 bit, since the mutual information with a binary variable cannot exceed its entropy of $\log 2$.)
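To make this concrete, here is a small NumPy sketch (my own illustration, not code from the post) that computes the JS divergence in two ways: directly from the definition above, and as the mutual information $I(X; I)$ under the mixture construction. The function names (`js_divergence`, `js_as_mutual_information`) are made up for this example.

```python
import numpy as np

def kl(p, q):
    """KL divergence KL(p || q) in bits, assuming q > 0 wherever p > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def js_divergence(q, r):
    """JSD(q, r) = 0.5*KL(q||m) + 0.5*KL(r||m), with m the equal mixture."""
    m = 0.5 * (q + r)
    return 0.5 * kl(q, m) + 0.5 * kl(r, m)

def js_as_mutual_information(q, r):
    """JSD computed as I(X; I): X is drawn from the mixture m = (q + r)/2,
    and I flags which component generated X, so that
    P(I=1 | X=x) = q(x) / (q(x) + r(x))."""
    # Joint distribution over (I, X): P(I=1, X=x) = 0.5*q(x), P(I=0, X=x) = 0.5*r(x)
    joint = np.vstack([0.5 * r, 0.5 * q])   # rows: I=0, I=1
    px = joint.sum(axis=0)                  # marginal of X (this is the mixture m)
    pi = joint.sum(axis=1)                  # marginal of I (0.5, 0.5)
    outer = np.outer(pi, px)                # product of marginals
    mask = joint > 0
    return np.sum(joint[mask] * np.log2(joint[mask] / outer[mask]))

rng = np.random.default_rng(0)
q = rng.random(6); q /= q.sum()
r = rng.random(6); r /= r.sum()

print(js_divergence(q, r))             # e.g. 0.07...
print(js_as_mutual_information(q, r))  # matches the value above
print(js_divergence(q, r) <= 1.0)      # True: bounded by 1 bit, as noted above
```

Running it on random discrete distributions, the two computations agree to floating-point precision, and the value never exceeds 1 bit.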