Jensen–Shannon divergence

In probability theory and statistics, the Jensen–Shannon divergence is a method of measuring the similarity between two probability distributions. It is also known as the information radius (IRad) or the total divergence to the average. It is based on the Kullback–Leibler divergence, with some notable (and useful) differences: it is symmetric, and it always has a finite value. The square root of the Jensen–Shannon divergence is a metric, often referred to as the Jensen–Shannon distance.
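
Concretely, the divergence is defined through the mixture distribution M = ½(P + Q), as JSD(P ∥ Q) = ½ D(P ∥ M) + ½ D(Q ∥ M), where D is the Kullback–Leibler divergence. As an illustration (not part of the original article; the function names and the choice of log base 2 are assumptions made here for the sketch), a minimal NumPy implementation for discrete distributions might look like this:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits (log base 2).
    Terms where p == 0 contribute zero by convention."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetric in p and q, and always finite,
    because each KL term compares against the mixture m."""
    m = 0.5 * (np.asarray(p, dtype=float) + np.asarray(q, dtype=float))
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def js_distance(p, q):
    """Square root of the JS divergence; this is a true metric."""
    return np.sqrt(js_divergence(p, q))

# Example: two discrete distributions over three outcomes
p = [0.5, 0.5, 0.0]
q = [0.0, 0.1, 0.9]
print(js_divergence(p, q))  # finite, even though KL(p || q) would be infinite
print(js_distance(p, q))
```

Because each term compares a distribution against the mixture M, which is nonzero wherever either input is, the result stays finite even when the plain Kullback–Leibler divergence between P and Q would be infinite; with log base 2 it is also bounded above by 1.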

