Wednesday, June 1, 2011

Difference between Kullback–Leibler (KL) and Kolmogorov–Smirnov (KS) distance

A nice excerpt from a Stack Exchange answer:
The KL-divergence is typically used in information-theoretic settings, or even Bayesian settings, to measure the information change between distributions before and after applying some inference, for example. It's not a distance in the typical (metric) sense, because of lack of symmetry and triangle inequality, and so it's used in places where the directionality is meaningful.
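The lack of symmetry mentioned above is easy to see numerically. Here is a minimal sketch (the distributions and the `kl` helper are made up for illustration) computing KL(p || q) and KL(q || p) for two discrete distributions:

```python
import math

def kl(p, q):
    # KL(p || q) = sum_i p_i * log(p_i / q_i)
    # Assumes q_i > 0 wherever p_i > 0; terms with p_i == 0 contribute nothing.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two example distributions over the same three-point support
p = [0.9, 0.05, 0.05]
q = [0.5, 0.25, 0.25]

kl_pq = kl(p, q)
kl_qp = kl(q, p)

print(kl_pq)  # KL(p || q)
print(kl_qp)  # KL(q || p) -- a different number: KL is not symmetric
```

Because KL(p || q) ≠ KL(q || p) in general, it fails the symmetry requirement of a metric, which is exactly why it is reserved for settings where the direction (e.g. "prior to posterior") carries meaning.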
The KS-distance is typically used in the context of a non-parametric test. In fact, I've rarely seen it used as a generic "distance between distributions", where the ℓ1 distance, the Jensen–Shannon distance, and other distances are more common.
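By contrast, the two-sample KS statistic is the largest vertical gap between the two empirical CDFs, and it is symmetric in its arguments. A minimal sketch (the `ks_statistic` helper and the sample data are made up for the example; in practice one would use a library routine such as SciPy's `ks_2samp`):

```python
import bisect

def ks_statistic(xs, ys):
    # Two-sample KS statistic: sup over t of |F_x(t) - F_y(t)|,
    # where F_x and F_y are the empirical CDFs of the two samples.
    xs, ys = sorted(xs), sorted(ys)

    def ecdf(sorted_sample, t):
        # Fraction of the sample that is <= t, via binary search.
        return bisect.bisect_right(sorted_sample, t) / len(sorted_sample)

    # The supremum is attained at one of the observed data points.
    return max(abs(ecdf(xs, t) - ecdf(ys, t)) for t in set(xs + ys))

xs = [1, 2, 3, 4]
ys = [3, 4, 5, 6]

d = ks_statistic(xs, ys)
print(d)  # same value regardless of argument order
```

Unlike KL, swapping the two samples leaves the statistic unchanged, which is what makes it usable as the test statistic of the (non-parametric) Kolmogorov–Smirnov test.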
