Suppose I have a vector $x^t \in \mathbb{R}^n$ that is a random variable in $t$. I define the measure $D(x) := \max_{i,j} |x_i - x_j|$, which is essentially the maximum discrepancy between any two entries of the vector.
A paper I'm currently reading attempts to bound this measure for a certain process. I believe the details are unimportant, except that the entries sum to zero for each $t$, i.e. $\sum_i x_i = 0$ (so the entries are not all positive).
However, the paper goes on to bound another quantity, the variational distance between the vector $x$ and the $0$-vector: $\|x\| := \frac{1}{2} \sum_i |x_i|$.
According to the Wikipedia article on variational distance, the measure $D(x)$ should coincide with this variational distance, but I don't see how the two can be equal.
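For a concrete check (my own example, not from the paper), take $x = (1, 1, -2)$, which satisfies $\sum_i x_i = 0$. Then

$$D(x) = \max_{i,j} |x_i - x_j| = |1 - (-2)| = 3, \qquad \|x\| = \tfrac{1}{2}\bigl(|1| + |1| + |{-2}|\bigr) = 2,$$

so the two quantities already disagree on this vector, unless I'm misreading one of the definitions.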