Probabilistic metric space

A probabilistic metric space is a generalization of a metric space in which the distance no longer takes values in the non-negative real numbers, but in distribution functions.

Let D+ be the set of all probability distribution functions F such that F(0) = 0 (F is a nondecreasing, left-continuous mapping from ℝ into [0, 1] such that sup F = 1).

The ordered pair (S, F) is said to be a probabilistic metric space if S is a nonempty set and F: S × S → D+ (F(p, q) is denoted by F_{p,q} for every (p, q) ∈ S × S) satisfies the following conditions (a numerical sketch checking them on a simple example follows the list):

  • F_{u,v}(x) = 1 for every x > 0 ⇔ u = v (u, v ∈ S).
  • F_{u,v} = F_{v,u} for every u, v ∈ S.
  • F_{u,v}(x) = 1 and F_{v,w}(y) = 1 ⇒ F_{u,w}(x + y) = 1 for u, v, w ∈ S and x, y ∈ ℝ⁺.
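
A standard illustration is that every ordinary metric space (S, d) gives rise to a probabilistic metric space by taking F_{p,q} to be the distribution function that jumps from 0 to 1 at d(p, q). The following Python sketch checks the three conditions numerically for this construction (the points in S, the Euclidean metric d and the grid of test values are arbitrary choices made here purely for illustration):

# Sketch: an ordinary metric space (S, d) induces a probabilistic metric space
# by letting F_{p,q}(x) = 1 if x > d(p, q) and 0 otherwise.  The loop below
# tests the three defining conditions on a few points of the plane.

import itertools

def d(p, q):
    # Euclidean distance in the plane
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def F(p, q):
    # Induced distribution function: F_{p,q}(x) = 1 if x > d(p, q), else 0
    return lambda x: 1.0 if x > d(p, q) else 0.0

S = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 3.0)]
xs = [0.05 * k for k in range(1, 101)]  # test values x in (0, 5]

for u, v, w in itertools.product(S, repeat=3):
    # condition 1: F_{u,v}(x) = 1 for every x > 0  <=>  u = v
    assert all(F(u, v)(x) == 1.0 for x in xs) == (u == v)
    # condition 2: F_{u,v} = F_{v,u}
    assert all(F(u, v)(x) == F(v, u)(x) for x in xs)
    # condition 3: F_{u,v}(x) = 1 and F_{v,w}(y) = 1  =>  F_{u,w}(x + y) = 1
    for x, y in itertools.product(xs[::10], repeat=2):
        if F(u, v)(x) == 1.0 and F(v, w)(y) == 1.0:
            assert F(u, w)(x + y) == 1.0

print("all three conditions hold for the induced step distributions")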

Probability metric of random variables

A probability metric D between two random variables X and Y may be defined, for example, as

D(X, Y) = ∫∫ |x − y| F(x, y) dx dy,

where F(x, y) denotes the joint probability density function of the random variables X and Y. If X and Y are independent of each other, the equation above reduces to

D(X, Y) = ∫∫ |x − y| f(x) g(y) dx dy,

where f(x) and g(y) are the probability density functions of X and Y respectively.
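
As a rough numerical illustration, the double integral above can be approximated by a Riemann sum. The sketch below uses two independent exponential densities with rate 1, a truncated grid and a midpoint rule; these are arbitrary illustrative choices, and for this pair of densities the exact value is D(X, Y) = E|X − Y| = 1.

# Sketch: midpoint Riemann sum for D(X, Y) = ∫∫ |x - y| f(x) g(y) dx dy
# with f = g the exponential density of rate 1 (exact value: 1).

import math

def f(x):
    # density of an exponential random variable with rate 1
    return math.exp(-x) if x >= 0.0 else 0.0

g = f  # Y has the same density as X, but is treated as independent of it

h = 0.05                                        # grid step
grid = [h * (k + 0.5) for k in range(400)]      # midpoints, integral truncated at 20

D = sum(abs(x - y) * f(x) * g(y) for x in grid for y in grid) * h * h
print(f"Riemann-sum estimate of D(X, Y): {D:.3f}   (exact value: 1)")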

One may easily show that such probability metrics do not satisfy the first metric axiom, or satisfy it if and only if both arguments X and Y are certain events described by Dirac delta probability density functions. In this case

D(X, Y) = ∫∫ |x − y| δ(x − μ_x) δ(y − μ_y) dx dy = |μ_x − μ_y|,

and the probability metric simply reduces to the metric between the expected values μ_x, μ_y of the variables X and Y.

For all other random variables X, Y the probability metric does not satisfy the identity of indiscernibles condition required of the metric of a metric space, that is,

D(X, X) > 0.
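
For instance, if both arguments are uniformly distributed on [0, 1] (and treated as independent), then D = ∫₀¹ ∫₀¹ |x − y| dx dy = 1/3 > 0, even though the two arguments have identical distributions.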

Figure: probability metric between two random variables X and Y, both having normal distributions with the same standard deviation σ, for several values of σ (beginning with the bottom curve). μ_xy denotes the distance between the means of X and Y.

Example

For example, if both probability density functions of the random variables X and Y are normal distributions (N) having the same standard deviation σ, integrating D(X, Y) yields

D_NN(X, Y) = μ_xy + (2σ/√π) exp(−μ_xy²/(4σ²)) − μ_xy erfc(μ_xy/(2σ)),

where

μ_xy = |μ_x − μ_y|,

and erfc(·) is the complementary error function.

In this case, for identical arguments (μ_xy = 0),

D_NN(X, X) = 2σ/√π > 0,

so the metric does not vanish even when the two arguments coincide.
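
As a quick numerical check, the closed-form expression for D_NN can be compared against a Monte Carlo estimate of E|X − Y| for independent X and Y with these normal densities (the particular values of μ_x, μ_y and σ below are arbitrary illustrative choices):

# Sketch: closed-form D_NN versus a Monte Carlo estimate of E|X - Y|
# for independent X ~ N(mu_x, sigma^2) and Y ~ N(mu_y, sigma^2).

import math
import random

def d_nn(mu_x, mu_y, sigma):
    # closed-form probability metric for two normals with equal sigma
    mu_xy = abs(mu_x - mu_y)
    return (mu_xy
            + 2.0 * sigma / math.sqrt(math.pi) * math.exp(-mu_xy**2 / (4.0 * sigma**2))
            - mu_xy * math.erfc(mu_xy / (2.0 * sigma)))

random.seed(1)
mu_x, mu_y, sigma, n = 0.0, 1.5, 0.7, 200_000
mc = sum(abs(random.gauss(mu_x, sigma) - random.gauss(mu_y, sigma)) for _ in range(n)) / n

print(f"closed form : {d_nn(mu_x, mu_y, sigma):.4f}")
print(f"Monte Carlo : {mc:.4f}")   # the two values should agree to about 2 decimal places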

Probability metric of random vectors

The probability metric of random variables may be extended to a metric D(X, Y) of random vectors X, Y by substituting |x − y| with any metric operator d(x, y):

D(X, Y) = ∫∫ d(x, y) F(x, y) dx dy,

where F(x, y) is the joint probability density function of the random vectors X and Y. For example, substituting d(x, y) with the Euclidean metric and assuming that the vectors X and Y are mutually independent yields

D(X, Y) = ∫∫ √(Σᵢ (xᵢ − yᵢ)²) f(x) g(y) dx dy.
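
As an illustration of the vector case, the integral can again be estimated by Monte Carlo sampling. The sketch below uses independent standard bivariate normal vectors and the Euclidean metric, an arbitrary choice for which the exact value of D(X, Y) happens to be √π:

# Sketch: Monte Carlo estimate of D(X, Y) = E d(X, Y) for independent
# standard bivariate normal vectors X, Y and the Euclidean metric d.

import math
import random

def euclid(x, y):
    # Euclidean metric d(x, y) in the plane
    return math.hypot(x[0] - y[0], x[1] - y[1])

random.seed(2)
n = 200_000
acc = 0.0
for _ in range(n):
    x = (random.gauss(0.0, 1.0), random.gauss(0.0, 1.0))  # sample of X
    y = (random.gauss(0.0, 1.0), random.gauss(0.0, 1.0))  # sample of Y
    acc += euclid(x, y)

print(f"Monte Carlo D(X, Y) : {acc / n:.4f}")
print(f"exact value sqrt(pi): {math.sqrt(math.pi):.4f}")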
