Jensen–Shannon divergence

In probability theory and statistics, the Jensen–Shannon divergence is a popular method of measuring the similarity between two probability distributions.

Definition
Consider the set $$M_+^1(A)$$ of probability distributions on A, where A is a set equipped with a σ-algebra of measurable subsets.

The Jensen–Shannon divergence (JSD) $$M_+^1(A) \times M_+^1(A) \rightarrow [0,\infty)$$ is a symmetrized and smoothed version of the Kullback–Leibler divergence $$D(P \parallel Q)$$. It is defined by

$$JSD(P \parallel Q)= \frac{1}{2}D(P \parallel M)+\frac{1}{2}D(Q \parallel M)$$

where $$M=\frac{1}{2}(P+Q)$$ is a mixture distribution of $$P$$ and $$Q$$.
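For discrete distributions, the definition above translates directly into code. The following is a minimal sketch in Python with NumPy; the function names `kl_divergence` and `jsd` are illustrative choices, not part of any standard library, and the divergences are computed in nats (natural logarithm).

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats.

    Terms where p == 0 contribute nothing (by the convention
    0 * log 0 = 0); assumes q > 0 wherever p > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jsd(p, q):
    """Jensen-Shannon divergence via the mixture M = (P + Q) / 2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Two distributions over the same three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.1, 0.4, 0.5]
print(jsd(p, q))          # symmetric: equals jsd(q, p)
print(jsd(p, p))          # a distribution has zero divergence from itself
```

Note that the mixture M is always nonzero wherever P or Q is, so the two KL terms are always finite; for distributions with disjoint support the JSD attains its maximum value of ln 2.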