19

It is well known that the symmetry property $d(x,y)=d(y,x)$ is not necessary in the definition of a distance if the triangle inequality is carefully stated. On the other hand, there are examples of functions satisfying

(1) $d(x,y)\geq 0$ and $d(x,y)=0$ if and only if $x=y$

(2) $d(x,y)\leq d(x,z)+d(z,y)$

which are not symmetric: on the three-point set $\{a,b,c\}$, take the non-zero values of $d$ to be $1=d(a,b)=d(c,b)$ and $2=d(b,a)=d(b,c)=d(a,c)=d(c,a)$.
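The axioms for this three-point example can be checked mechanically; here is a short Python sketch (the dictionary encoding is mine):

```python
from itertools import product

# The three-point quasimetric above:
# d(a,b) = d(c,b) = 1; d(b,a) = d(b,c) = d(a,c) = d(c,a) = 2; d(x,x) = 0.
d = {
    ("a", "b"): 1, ("c", "b"): 1,
    ("b", "a"): 2, ("b", "c"): 2, ("a", "c"): 2, ("c", "a"): 2,
    ("a", "a"): 0, ("b", "b"): 0, ("c", "c"): 0,
}
points = "abc"

# (1) non-negativity, and d(x,y) = 0 exactly when x = y
assert all((d[x, y] == 0) == (x == y) for x, y in product(points, repeat=2))

# (2) triangle inequality d(x,y) <= d(x,z) + d(z,y)
assert all(d[x, y] <= d[x, z] + d[z, y]
           for x, y, z in product(points, repeat=3))

# and d is genuinely non-symmetric
assert d["a", "b"] != d["b", "a"]
```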

Do you know other examples of "non-symmetric distances"? Are there examples on the real numbers, etc.? Are there examples of spaces where every function satisfying (1) and (2) is symmetric?

  • 0
I lived in a part of Toronto where most of the residential streets were one-way (which tended to channel most of the traffic onto the two-way main streets). So the minimum driving distance $d(x,y)$ from $x$ to $y$ was often not equal to $d(y,x)$. 2018-08-30

9 Answers

23

A classic example is the circle $S^1$ with metric the length of the shortest clockwise path between $x$ and $y$, but let me say some things in general.
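In coordinates, taking points as arc-length positions and "clockwise" as the direction of increasing coordinate (a convention of mine, for concreteness), this metric is just a difference taken modulo the circumference:

```python
import math

def d_cw(x, y, circumference=2 * math.pi):
    """Length of the shortest clockwise path from x to y on a circle,
    with points given as arc-length coordinates in [0, circumference)."""
    return (y - x) % circumference  # one orientation only, so no symmetry

# A quarter turn one way costs pi/2; going back the same way costs 3*pi/2.
assert math.isclose(d_cw(0.0, math.pi / 2), math.pi / 2)
assert math.isclose(d_cw(math.pi / 2, 0.0), 3 * math.pi / 2)
```

The triangle inequality holds because $(z-x) \bmod C + (y-z) \bmod C$ is congruent to $(y-x) \bmod C$ and lies in $[0, 2C)$, so it equals either $(y-x) \bmod C$ or that value plus $C$.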

Lawvere once made the point that this is really a much more natural definition of a metric space, since it allows them to be interpreted as a type of enriched category. The triangle inequality then becomes a consequence of composition of morphisms, which is extremely reasonable if you think of distances in a metric space as measuring "the best way to get from $a$ to $b$": clearly the best way to get from $a$ to $b$ to $c$ is at most as good as the best way to get from $a$ to $c$, and this is precisely the (asymmetric) triangle inequality.

On the other hand, there is no reason in general that the best way to get from $a$ to $b$ has to look anything like the best way to get from $b$ to $a$. This is the sense in which the symmetry requirement is unnatural. There is also no reason in general that it should be possible to get from $a$ to $b$ at all! This is the sense in which the requirement that distances be finite is unnatural. Finally, there is no reason in general that it should not be possible to instantaneously get from $a$ to $b$ (in other words, that $a$ and $b$ be isomorphic); this is the sense in which the requirement that distances be positive-definite is unnatural.

Here is how to use that idea to generate a large class of examples. Let $(M, d)$ be a metric space and let $h : M \to \mathbb{R}$ be a function. Define a new metric
$$d'(x, y) = \begin{cases} d(x, y) + h(y) - h(x) & \text{if } h(y) \ge h(x) \\ d(x, y) & \text{otherwise.} \end{cases}$$

Intuitively, $h$ is a potential (e.g. a height if one is thinking of gravitational potential), and the new metric $d'$ penalizes you for going against the potential (e.g. going uphill).
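The construction can be sketched in a few lines of Python (the function names are mine). Note that the two cases collapse into adding $\max(h(y)-h(x),\,0)$ to the base distance:

```python
def skewed_metric(d, h):
    """Given a metric d and a potential h, return the quasimetric d' that
    additionally charges for any increase in potential (going 'uphill')."""
    def d_prime(x, y):
        return d(x, y) + max(h(y) - h(x), 0)
    return d_prime

# Example on the reals: the usual distance, with h(x) = x as the potential.
base = lambda x, y: abs(x - y)
d_up = skewed_metric(base, lambda x: x)

assert d_up(0, 3) == 6   # uphill: base distance 3 plus potential gain 3
assert d_up(3, 0) == 3   # downhill: just the base distance
```

The triangle inequality survives because $\max(a+b,0) \le \max(a,0) + \max(b,0)$, applied to the potential differences along $x \to z \to y$.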

The directed graph example given by mjqxxxx is also a good illustration of this philosophy about metric spaces.

  • 1
In the third paragraph, if your geometric intuition is getting in the way, pretend that $a$ and $b$ are not points in some space but possible states of a physical system, and $d(a, b)$ measures the minimal amount of energy that needs to be put into the system to get it from state $a$ to state $b$. 2011-02-23
10

Most distances in real life are going to be more or less asymmetric due to one-way roads, going uphill resp. downhill, different public transportation schedules, congestion, etc.

  • 0
and don't forget friendship closeness. I might not be my close friend's close friend. 2018-07-05
8

The Hausdorff distance is a symmetric version of a natural non-symmetric distance.

6

For any directed graph $G=(V,E)$, we may define a non-symmetric distance as follows: for $x,y\in V$, the directed graph distance is the length of the shortest directed path from $x$ to $y$, or $\infty$ if there is no such path. This clearly satisfies the triangle inequality as stated above. An even more general family of examples is generated by allowing a positive weight, in which case we may take the distance to be the smallest weight of any directed path from $x$ to $y$ (or, if $G$ is infinite, the greatest lower bound of all such weights; in this case we must additionally require all edge weights to be bounded away from zero).
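For a finite graph, these distances can be computed with any all-pairs shortest-path routine; here is a sketch using Floyd–Warshall (the function name and encoding are mine):

```python
import math

def directed_distances(n, weighted_edges):
    """All-pairs shortest directed-path distances via Floyd-Warshall.
    weighted_edges: iterable of (u, v, w) with positive weight w;
    unreachable pairs get distance infinity, matching the convention above."""
    dist = [[0 if i == j else math.inf for j in range(n)] for i in range(n)]
    for u, v, w in weighted_edges:
        dist[u][v] = min(dist[u][v], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                dist[i][j] = min(dist[i][j], dist[i][k] + dist[k][j])
    return dist

# A directed 3-cycle 0 -> 1 -> 2 -> 0: every pair is reachable,
# but going "against" the cycle takes the long way around.
dist = directed_distances(3, [(0, 1, 1), (1, 2, 1), (2, 0, 1)])
assert dist[0][1] == 1 and dist[1][0] == 2
```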

3

This may or may not be what you're looking for, but I thought the idea was fun, so I'll share:

This will be a non-symmetric metric on $\mathbb{Z}^+$:

Suppose that one is only allowed to move left one unit at a time, and in addition one is allowed to jump right from one power of two to the next. For example, to get from $5$ to $11$, one must travel as follows:

$$5 \to 4 \to 8 \to 16 \to 15 \to 14 \to 13 \to 12 \to 11.$$

For a positive integer $n$, define $T(n)$ to be the largest power of $2$ which is less than or equal to $n$. Similarly, define $S(n)$ to be the smallest power of $2$ greater than or equal to $n$. Then, if $|\cdot|$ denotes the usual absolute value, we have:

$ d(x,y) = \left\{ \begin{array}{lcc} |x - y| & & x \geq y \\ |x - T(x)| + \log_2(S(y)/T(x)) + |S(y) - y| & & x < y \end{array} \right. $

If $x \geq y$, one just moves left $|x - y|$ units. If not, one moves left from $x$ to $T(x)$, the largest power of $2$ less than or equal to $x$. One then jumps $\log_2(S(y)) - \log_2(T(x))$ powers of $2$, and finally moves the remaining $|S(y) - y|$ units left. It is not hard to show that this is a metric (it clearly satisfies condition (1), and to show it satisfies (2), one just needs to consider the case of $d(x,y)$ when $x > y$).
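The formula is easy to implement and test against the worked example (the helper implementations via `bit_length` are mine):

```python
def T(n):
    """Largest power of 2 less than or equal to n (n a positive integer)."""
    return 1 << (n.bit_length() - 1)

def S(n):
    """Smallest power of 2 greater than or equal to n."""
    return 1 if n == 1 else 1 << (n - 1).bit_length()

def d(x, y):
    if x >= y:
        return x - y  # only unit steps to the left are needed
    # go left to T(x), jump powers of two up to S(y), then go left to y;
    # the number of jumps is log2(S(y)) - log2(T(x))
    return (x - T(x)) + (S(y).bit_length() - T(x).bit_length()) + (S(y) - y)

# The worked example: 5 -> 4 -> 8 -> 16 -> 15 -> 14 -> 13 -> 12 -> 11
assert d(5, 11) == 8
assert d(11, 5) == 6   # going left is direct, so the metric is asymmetric
```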

This metric came to mind because of Cauchy's proof of the arithmetic-geometric mean inequality (http://en.wikipedia.org/wiki/Inequality_of_arithmetic_and_geometric_means#Proof_by_Cauchy).

Let $P(n)$ denote the statement that the AG-mean inequality is true for $n$ numbers. Cauchy's idea was to prove the inequality by first inducting along powers of two (i.e., $P(2^k)$ is true implies $P(2^{k+1})$ is true). Then, he showed that $P(n)$ is true implies $P(n-1)$ is true.

Assuming that we know $P(n)$ is true, $d(n,m)$ is the number of steps needed, using Cauchy's proof, to show that $P(m)$ is true.

2

What about the following distance function on the reals?

$$d(x,y) = \begin{cases} 1, & \text{if } x < y \\ 2, & \text{if } x > y \\ 0, & \text{if } x = y \end{cases}$$

A bit artificial, but it seems to fulfill the requirements, unless I overlooked something.
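A brute-force check of the requirements on a sample of reals, assuming the intended cases are $1$ if $x < y$, $2$ if $x > y$, and $0$ if $x = y$:

```python
from itertools import product

def d(x, y):
    """Two-valued quasimetric on the reals: cheap to go right, dearer left."""
    if x == y:
        return 0
    return 1 if x < y else 2

pts = [-1.5, 0.0, 0.3, 2.0, 7.0]

# (1): d vanishes exactly on the diagonal, and is positive elsewhere
assert all((d(x, y) == 0) == (x == y) for x, y in product(pts, repeat=2))

# (2): any two-leg trip costs at least 1 + 1 = 2, which bounds every d(x,y)
assert all(d(x, y) <= d(x, z) + d(z, y) for x, y, z in product(pts, repeat=3))

# and the function is not symmetric
assert d(0.0, 1.0) != d(1.0, 0.0)
```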

  • 0
If you want a generalization, see Qiaochu's answer. I think it is more deserving of the tick mark. 2011-02-23
1

The Kullback–Leibler divergence is another example. Defined between two probability distributions $p(x)$ and $q(x)$, it takes the form $$D_{KL}(p\|q) = \int p(x)\log\left(\frac{p(x)}{q(x)}\right)dx.$$ It is easily seen that $D_{KL}(p\|q)\neq D_{KL}(q\|p)$ in general. (Note, however, that the KL divergence does not satisfy the triangle inequality (2), so it is only a "distance" in a looser sense.)
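For discrete distributions, the asymmetry is easy to exhibit numerically (the function name and example distributions are mine):

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) for discrete distributions given as probability lists.
    Assumes q[i] > 0 wherever p[i] > 0 (otherwise the divergence is infinite)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]

assert kl_divergence(p, p) == 0                       # zero only against itself
assert kl_divergence(p, q) != kl_divergence(q, p)     # not symmetric
```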

0

$d(x,y)=\begin{cases}x-y, \mbox{ if } x\geq y\\1, \mbox{ otherwise }\end{cases}$ will do!
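A quick sanity check of this last example on a sample of reals (the sample points are mine):

```python
from itertools import product

def d(x, y):
    """Moving left costs the distance travelled; moving right costs a flat 1."""
    return x - y if x >= y else 1

pts = [0.0, 0.25, 1.0, 2.5, 4.0]

# (1): d vanishes exactly on the diagonal
assert all((d(x, y) == 0) == (x == y) for x, y in product(pts, repeat=2))

# (2): triangle inequality on the sample
assert all(d(x, y) <= d(x, z) + d(z, y) for x, y, z in product(pts, repeat=3))

# asymmetric: going right from 0 to 4 costs 1, coming back costs 4
assert d(0.0, 4.0) != d(4.0, 0.0)
```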