# A quick explanation of the Wasserstein metric

Note that Dembski has uploaded a revised manuscript which now correctly attributes the measure to Rényi and thanks the many critics for their contributions.

I am not a mathematician, but let me give it a try, and others can amend and revise my comments.

The Kantorovich/Wasserstein distance metric is also known under such names as the Dudley, Fortet-Mourier, and Mallows metric, and is defined as follows:

$$d_W(F, G) = \inf \lbrace E(\Vert X - Y \Vert) \rbrace$$

where $E$ denotes the expectation of a random variable and the infimum ($\inf$) is taken over all random variables $X$ which take a distribution $F$ and random variables $Y$ which take a distribution $G$.

Equivalently,

$$d_W(F, G) = \inf_{\tau \in \tau_{x,y}} \int \Vert x - y \Vert \, d\tau(x, y)$$

where $\tau_{x,y}$ is the set of all joint distributions of random variables $X$ and $Y$ whose marginal distributions are $F$ and $G$.
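On the real line this infimum is easy to compute empirically: the optimal coupling pairs the sorted values of the two samples, so the 1-Wasserstein distance between two equal-size samples reduces to the mean absolute difference of their order statistics. A minimal sketch in plain Python (the function name is my own):

```python
def wasserstein_1d(x, y):
    """Empirical 1-Wasserstein distance between two equal-size
    samples on the real line.

    In one dimension the optimal coupling matches order statistics,
    so the infimum over joint distributions reduces to the mean
    absolute difference of the sorted samples.
    """
    if len(x) != len(y):
        raise ValueError("samples must have the same size")
    return sum(abs(a - b) for a, b in zip(sorted(x), sorted(y))) / len(x)

# A sample and the same sample shifted by 2: the distance is the shift.
a = [0.3, -1.2, 2.5, 0.0, 1.1]
b = [v + 2.0 for v in a]
print(wasserstein_1d(a, b))  # 2.0
```

This sorting shortcut is specific to one dimension; in higher dimensions the coupling must be optimised explicitly (the Monge-Kantorovich mass-transfer problem).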

These metrics define a 'distance' between two probability distributions and are one of many such metrics that have been defined mathematically. There is a good survey of many of these metrics, *On Choosing and Bounding Probability Metrics*. Different circumstances call for different distance metrics.

These metrics have found applicability in non-linear equations, variational approaches to entropy dissipation, phase transitions and symmetry breaking in singular diffusion, random walks, Markov processes, and more. Needless to say, these metrics are commonly applied in a wide variety of settings. Applications of this metric to Markov processes may be of particular interest to evolutionary theory.

Another way of looking at this is to assume one has two samples $X$ and $Y$ of the same size, $X=\lbrace x_1,\dots,x_n \rbrace$ and $Y=\lbrace y_1,\dots,y_n \rbrace$. The Mallows distance between the empirical distributions is

$$d_M(X, Y) = \min_{\pi} \left( \frac{1}{n} \sum_{i=1}^{n} \Vert x_i - y_{\pi(i)} \Vert^2 \right)^{1/2}$$

where the minimum is taken over all permutations $\pi$ of $\lbrace 1, \dots, n \rbrace$.

Rachev, S. T. (1984), The Monge-Kantorovich problem on mass transfer and its applications in stochastics, *Theor. Probab. Appl.*, 29, 647-676.

As far as interesting applications are concerned, see for example:

*Minimal Entropy Probability Paths Between Genome Families*