Jensen-Shannon divergence

The Jensen-Shannon divergence is a symmetrized version of the Kullback-Leibler divergence.  Unlike the KL divergence, the square root of the JS divergence is a true metric obeying the triangle inequality.  Interestingly, if $X$ is a random variable drawn from the equal mixture of two distributions $Q$ and $R$, then the JS divergence between $Q$ and $R$ equals the mutual information between $X$ and the indicator variable $I$ telling which component $X$ was drawn from, so that $P(I=1 \mid X) = q(X)/(q(X) + r(X))$, where $q(x)$ is the density of $Q$ and $r(x)$ is the density of $R$.  (One immediate consequence is that the JS divergence is at most 1 bit, using base-2 logarithms.)
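A minimal numerical sketch of these facts (function names are mine, not from any particular library): it computes the JS divergence of two discrete distributions directly from its KL definition, and then checks the mutual-information identity in its equivalent entropy form $\mathrm{JSD}(Q, R) = H(M) - \tfrac{1}{2}H(Q) - \tfrac{1}{2}H(R)$, where $M$ is the equal mixture.

```python
import numpy as np

def entropy(p, base=2.0):
    """Shannon entropy of a discrete distribution, in bits by default."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p)) / np.log(base)

def kl_divergence(a, b, base=2.0):
    """Kullback-Leibler divergence KL(a || b) for discrete distributions."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    mask = a > 0
    return np.sum(a[mask] * np.log(a[mask] / b[mask])) / np.log(base)

def js_divergence(q, r, base=2.0):
    """Jensen-Shannon divergence: average KL of q and r to their mixture.

    With base-2 logarithms the result always lies in [0, 1]."""
    q = np.asarray(q, dtype=float)
    r = np.asarray(r, dtype=float)
    m = 0.5 * (q + r)
    return 0.5 * kl_divergence(q, m, base) + 0.5 * kl_divergence(r, m, base)

q = np.array([0.9, 0.1])
r = np.array([0.1, 0.9])
m = 0.5 * (q + r)

jsd = js_divergence(q, r)

# Mutual information I(X; I) between the mixture sample X and the
# component indicator I, written as H(M) - 0.5*H(Q) - 0.5*H(R).
mi = entropy(m) - 0.5 * entropy(q) - 0.5 * entropy(r)
```

For these two distributions `jsd` and `mi` agree to floating-point precision, and both stay within the $[0, 1]$ bit bound noted above.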


  1. Prodie

    Funny… I was just talking to fatso about mutual information.

  2. hundalhh

    Fatso and I will probably be at NIPS. There’s a chance I will be in LA before that. – Hein
