What is Kullback-Leibler (KL) Divergence?

The Kullback-Leibler (KL) divergence metric quantifies how one probability distribution differs from a reference probability distribution. KL divergence is sometimes referred to as 'relative entropy' and is best used when one distribution has a much smaller sample size and a larger variance than the reference distribution.

[Figure: KL divergence graphic]
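For two discrete distributions P and Q defined over the same outcomes, the KL divergence of P from the reference Q is the expected value, under P, of log(P(x)/Q(x)). The snippet below is a minimal NumPy sketch of that calculation; the three-bin distributions are made-up illustrative values, and the helper name kl_divergence is not from any particular library.

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P || Q) for two discrete distributions given as probability arrays."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over outcomes where p > 0, treating 0 * log(0 / q) as 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Hypothetical three-bin histograms, e.g. an observed distribution vs. a baseline.
p = np.array([0.10, 0.40, 0.50])  # observed / comparison distribution
q = np.array([0.80, 0.15, 0.05])  # reference distribution

print(kl_divergence(p, q))  # non-negative; equals 0 only when p and q are identical
print(kl_divergence(q, p))  # a different value: KL divergence is asymmetric
```

Because the result changes when the two distributions are swapped, KL divergence is not a true distance metric.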
