
Let's say I have a complete weighted graph with $n$ nodes and strictly positive weights. I'd like to find the smallest $d\in\mathbb{N}$ such that there exists a set of $n$ points $x_1,\ldots,x_n\in\mathbb{R}^d$ whose pairwise Euclidean distances equal the corresponding edge weights.

A couple questions:

Is $d$ bounded by some constant for every $n$, or does it necessarily grow with $n$?

What is the complexity class of this problem?

So I am faced with solving a system of $\frac{n(n-1)}{2}$ multivariate quadratic equations, right? What's the best way to tackle this? I have read a bit that maybe resultant elimination is applicable, but I am not exactly sure how to apply it. Ideas?

Update:

Okay, so I found another necessary condition based on Joseph's answer: a Euclidean distance matrix has exactly one positive eigenvalue; all the others are nonpositive.
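For what it's worth, this condition is easy to check numerically. A quick NumPy sketch (the sample points and the tolerance are arbitrary choices of mine):

```python
import numpy as np

# Sketch: a matrix of pairwise Euclidean distances of distinct points
# should have exactly one positive eigenvalue (the necessary condition
# mentioned above). Random points and tolerance are arbitrary.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))                    # 8 random points in R^3
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)

eigs = np.linalg.eigvalsh(D)                       # D is symmetric
n_pos = int(np.sum(eigs > 1e-9))
print(n_pos)                                       # expect 1
```

Of course this is only a necessary condition; a matrix can pass it without being a Euclidean distance matrix.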

3 Answers

0

Try doing a search using the string "distance matrix". With various definitions of what it takes to be a distance, there is a literature on isometric embeddings of "distances" that obey various rules. In particular, in phylogenetics there is interest in knowing when a collection of distances arises from a tree or a weighted tree.

  • 0
    A lot of what I have read so far takes "distance matrix" to mean any edge-weight matrix that satisfies the properties of a metric. But I did find some interesting and pertinent facts about the spectrum of such matrices. – 2011-09-11
2

This doesn't necessarily have a solution. Consider the triangle inequality.

However, assuming the weights obey the triangle inequality, the answer is $d = n - 1$. The algorithm adds nodes one at a time. After $k$ nodes are placed in $\mathbb{R}^{k-1}$, the $(k+1)$st node lies at the intersection of $k$ spheres of dimension $k-1$ in $\mathbb{R}^k$, which generically is a pair of discrete points (and finding such a point is not too difficult). Adding more dimensions doesn't help, because only the displacement relative to the hyperplane containing the previous $k$ nodes matters.
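Here is a sketch of that incremental placement in Python (the function name and tolerance are mine, not standard). The sphere-intersection step reduces to a linear system plus one square root, and the construction fails loudly when the required squared height comes out negative, i.e., when the weights are not realizable this way:

```python
import numpy as np

def embed_incremental(D, tol=1e-9):
    """Place point k at the intersection of spheres around points 0..k-1.

    A sketch of the greedy scheme described above. Raises ValueError
    when the intersection is empty, i.e. the weights are not
    Euclidean-realizable by this construction.
    """
    n = len(D)
    X = np.zeros((n, max(n - 1, 1)))
    for k in range(1, n):
        d0 = D[0, k]
        if k == 1:
            X[1, 0] = d0                  # second point on the first axis
            continue
        # |x - X[i]|^2 = D[i,k]^2 minus the i=0 equation |x|^2 = d0^2
        # linearizes to  2 X[i].x = |X[i]|^2 + d0^2 - D[i,k]^2.
        A = 2.0 * X[1:k, : k - 1]
        b = np.array([X[i, : k - 1] @ X[i, : k - 1] + d0**2 - D[i, k] ** 2
                      for i in range(1, k)])
        y = np.linalg.lstsq(A, b, rcond=None)[0]
        h2 = d0**2 - y @ y                # squared height above current hull
        if h2 < -tol:
            raise ValueError("weights are not Euclidean-realizable")
        X[k, : k - 1] = y
        X[k, k - 1] = np.sqrt(max(h2, 0.0))
    return X

# 3-4-5 right triangle: the recovered coordinates reproduce the weights.
D = np.array([[0.0, 3.0, 4.0],
              [3.0, 0.0, 5.0],
              [4.0, 5.0, 0.0]])
X = embed_incremental(D)
print(np.round(np.linalg.norm(X[:, None] - X[None, :], axis=-1), 6))
```

(A full implementation would also check the least-squares residual, since the linear system can be inconsistent for non-Euclidean inputs.)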

I'm not sure whether there are other necessary relations in addition to the triangle inequality, but I don't think so.

  • 0
    I cannot state any necessary conditions off the top of my head, but surely the triangle inequality is not sufficient. See my answer for a counterexample. – 2011-09-08
2

This does not answer the question fully, but it is too long for a comment. I'll try to add more to the answer. Also, the answer is a bit rough; it does not explain most of the terms. I will polish it when I get some free time.

First of all, the weighted graph must define a metric; i.e., it should satisfy symmetry ($D(u,v) = D(v,u)$), nonnegativity ($D(u,v) \geq 0$, with equality iff $u=v$), and the triangle inequality ($D(u,v) \leq D(u,w)+D(w,v)$). The way the problem is set up, symmetry and nonnegativity come for free, so it suffices to check the triangle inequality (this can be done by brute force if $n$ is not too large).
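The brute-force check is a triple loop over vertices, $O(n^3)$ overall. A minimal sketch (function name and tolerance are mine):

```python
import numpy as np

def satisfies_triangle(D, tol=1e-12):
    # Brute-force O(n^3) check of D[u,v] <= D[u,w] + D[w,v],
    # as suggested above; fine for small n.
    n = len(D)
    return all(D[u, v] <= D[u, w] + D[w, v] + tol
               for u in range(n) for v in range(n) for w in range(n))

D_ok = np.array([[0, 1, 2], [1, 0, 1], [2, 1, 0]], dtype=float)
D_bad = np.array([[0, 1, 5], [1, 0, 1], [5, 1, 0]], dtype=float)   # 5 > 1 + 1
print(satisfies_triangle(D_ok), satisfies_triangle(D_bad))          # True False
```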

Technically, the question you are asking can be phrased as:

Which metrics can be isometrically embedded in $\ell^2$?

My first answer is: not all. To see this, take a graph with $10$ vertices (say) with weights: $D(x,y) = 1$ if either $x$ or $y$ is the special vertex $v_1$, and $D(x,y) = 2$ otherwise. (This is known as a Greek/British airline metric.) It is a fun exercise to show that this cannot be embedded in $\mathbb R^d$ for any dimension $d$.

I have to dig up a bit for a general criterion to decide this question.

Meanwhile, suppose that I told you that a particular metric can indeed be embedded in $\mathbb R^d$ for some $d$, possibly much larger than $n$. Then, somewhat remarkably, one can show that you can embed the $n$ points in $\mathbb R^n$ itself. This is ultimately related to the fact that every $n \times n$ positive semidefinite matrix $G$ is the Gram matrix of some vectors $v_1, v_2, \ldots, v_n \in \mathbb R^n$ (notice that the vectors live in an $n$-dimensional space), i.e., $G_{i,j} = \langle v_i, v_j \rangle$. Unfortunately, in many scenarios of interest, $n$ can be much larger than the minimum dimension $d$, so this statement by itself is not always useful.
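This Gram-matrix observation is essentially classical multidimensional scaling: double-center the squared distances, eigendecompose, and the number of (numerically) positive eigenvalues is the minimum embedding dimension. A sketch (function name and tolerance are mine):

```python
import numpy as np

def mds_embed(D, tol=1e-9):
    # Classical MDS sketch: recover coordinates from the Gram matrix
    # G = -1/2 J D^2 J; the rank of G is the minimum dimension d.
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    G = -0.5 * J @ (D ** 2) @ J
    w, V = np.linalg.eigh(G)
    if w.min() < -tol:
        raise ValueError("not embeddable in any R^d")
    keep = w > tol
    X = V[:, keep] * np.sqrt(w[keep])    # coordinates, one row per point
    return X, int(keep.sum())            # points and minimum dimension d

# Unit square in the plane: minimum dimension is 2.
P = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
D = np.linalg.norm(P[:, None] - P[None, :], axis=-1)
X, d = mds_embed(D)
print(d)   # 2
```

In floating point the rank decision depends on the eigenvalue threshold, so near-degenerate configurations need care.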

Added: The "correct" upper bound here is $d = n-1$. The discrete metric on $n$ points cannot be embedded in fewer than $n-1$ dimensions, so this is the best possible universal upper bound.

When I get to this, I would like to add known criteria for computing the minimum dimension needed, and possibly mention something about dimensionality reduction, a.k.a. the Johnson–Lindenstrauss lemma.

  • 0
    I am accepting your answer because it is the most complete, and I doubt I will get a complete list of necessary and sufficient conditions for this question. – 2011-09-22