Finding the coordinates of points from a distance matrix

I have a set of points (with unknown coordinates) and the matrix of pairwise distances between them. I need to find the coordinates of these points in order to plot them and show the solution produced by my algorithm.

I can place one of these points at the origin $(0,0)$ to simplify things, and find the others. Can anyone tell me whether it is possible to find the coordinates of the other points, and if so, how?

Thanks in advance!

EDIT: I forgot to say that I need the coordinates in the $x$-$y$ plane only.


Doing this with angles, as Jyrki suggested, is cumbersome and difficult to generalize to different dimensions. Here is an answer that is essentially a generalization of WimC's (below), and that also fixes an error in his answer. At the end, I show why this works, since the proof is simple and nice.

The algorithm

Given a distance matrix $D_{ij}$, define $$M_{ij} = \frac{D_{1j}^2 + D_{i1}^2 - D_{ij}^2}{2} \,.$$ If the dimensionality of the data that generated the distance matrix is not known, it is useful to know that the smallest (Euclidean) dimension in which the points can be embedded is given by the rank $k$ of the matrix $M$. No embedding is possible if $M$ is not positive semi-definite.
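
As a concrete illustration, here is a minimal sketch of this step in Python with numpy (the function name `gram_from_distances` and the tolerance `1e-9` are my own choices, not from the original post):

```python
import numpy as np

def gram_from_distances(D):
    """Build M_ij = (D_1j^2 + D_i1^2 - D_ij^2) / 2 from a distance matrix D."""
    D2 = np.asarray(D, dtype=float) ** 2
    # Row i contributes D2[i, 0], column j contributes D2[0, j] (0-based indices).
    return (D2[:, 0][:, None] + D2[0, :][None, :] - D2) / 2

# Usage sketch, given some n x n distance matrix D:
# M = gram_from_distances(D)
# eigvals = np.linalg.eigvalsh(M)
# k = np.sum(eigvals > 1e-9)            # numerical rank = embedding dimension
# embeddable = np.all(eigvals > -1e-9)  # positive semi-definite up to round-off
```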

The coordinates of the points can now be obtained by eigenvalue decomposition: if we write $M = USU^T$, where $S$ is the diagonal matrix of eigenvalues of $M$ and the columns of $U$ are the corresponding eigenvectors, then the matrix $X = U \sqrt S$ (the square root of $S$ can be taken element by element, since $S$ is diagonal) gives the positions of the points, each row corresponding to one point. Note that if the data points can be embedded in $k$-dimensional space, only $k$ columns of $X$ will be non-zero (those corresponding to the $k$ non-zero eigenvalues of $M$).
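
Continuing the sketch, again assuming numpy (`coordinates_from_gram` and the eigenvalue cutoff are mine):

```python
import numpy as np

def coordinates_from_gram(M, tol=1e-9):
    """Recover point coordinates from the Gram matrix M = U S U^T.

    Returns an n x k matrix whose rows are the point positions."""
    eigvals, U = np.linalg.eigh(M)         # M is symmetric, so eigh applies
    eigvals = np.clip(eigvals, 0.0, None)  # clip tiny negative round-off
    X = U * np.sqrt(eigvals)               # X = U sqrt(S): scale each column of U
    return X[:, eigvals > tol]             # keep the k non-zero-eigenvalue columns
```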

Why does this work?

If $D$ comes from distances between points, then there are $\mathbf x_i \in \mathbb R^m$ such that $$D_{ij}^2 = (\mathbf x_i - \mathbf x_j)^2 = \mathbf x_i^2 + \mathbf x_j^2 - 2\mathbf x_i \cdot \mathbf x_j \,.$$ Then the matrix $M$ defined above takes on a particularly nice form: $$M_{ij} = (\mathbf x_i - \mathbf x_1) \cdot (\mathbf x_j - \mathbf x_1) \equiv \sum_{a=1}^m \tilde x_{ia} \tilde x_{ja}\,,$$ where the elements $\tilde x_{ia} = x_{ia} - x_{1a}$ can be assembled into an $n \times m$ matrix $\tilde X$. In matrix form, $$M = \tilde X \tilde X^T \,.$$ Such a matrix is called a Gram matrix. Since the original vectors were given in $m$ dimensions, the rank of $M$ is at most $m$ (assuming $m \le n$).
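
This identity is easy to verify numerically; here is a quick sanity check, assuming numpy (the number of points, the dimension, and the seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
pts = rng.standard_normal((5, 3))  # 5 points in R^3 (arbitrary choice)
# Pairwise distance matrix D and the matrix M built from it as above.
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
D2 = D ** 2
M = (D2[:, 0][:, None] + D2[0, :][None, :] - D2) / 2
# The shifted points x_i - x_1 should satisfy M = X~ X~^T.
X_tilde = pts - pts[0]
assert np.allclose(M, X_tilde @ X_tilde.T)
```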

The points we get from the eigenvalue decomposition described above need not exactly match the points that were put into the calculation of the distance matrix; however, they can be obtained from them by a rotation (possibly combined with a reflection) and a translation. This can be proved, for example, by taking a singular value decomposition of $\tilde X$ and showing that if $\tilde X \tilde X^T = X X^T$ (where $X$ is obtained from the eigenvalue decomposition as above, $X = U\sqrt S$), then $X$ must be the same as $\tilde X$ up to an orthogonal transformation.
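
Rather than constructing the orthogonal transformation explicitly, one can check the equivalent statement that the recovered points reproduce all pairwise distances. A minimal end-to-end check, assuming numpy (all sizes and names here are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
pts = rng.standard_normal((6, 2))  # 6 points in the plane (arbitrary)
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

# Gram matrix and eigendecomposition, as in the algorithm above.
D2 = D ** 2
M = (D2[:, 0][:, None] + D2[0, :][None, :] - D2) / 2
eigvals, U = np.linalg.eigh(M)
X = (U * np.sqrt(np.clip(eigvals, 0.0, None)))[:, eigvals > 1e-9]

# The recovered configuration has the same pairwise distances, so it matches
# the original points up to an orthogonal transformation and a translation.
D_rec = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
assert np.allclose(D_rec, D)
```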


I assume this is in $\mathbb{R}^2$. You can compute possible points $v_j \in \mathbb{R}^2$ as follows. First compute the Gram matrix $M$ from the distance matrix $D$:

$$M_{i,j} = \frac{(D_{1,i})^2 + (D_{1, j})^2 - (D_{i, j})^2}{2}$$

for $i, j \in \{1,\dotsc, n\}$. (Note that the first row and column of $M$ consist of zeroes.) This is a positive semi-definite matrix of rank at most two. I'll assume the generic case where the rank is two (not all points on a single line).

Find an orthonormal basis $\{b_1, b_2\}$ for the column space of $M$ (e.g. by applying Gram-Schmidt). Let $m_j$ denote the $j$-th column of $M$. Then take $v_j = (\langle b_1, m_j \rangle, \langle b_2, m_j \rangle) \in \mathbb{R}^2$, where $\langle \cdot, \cdot \rangle$ is the Euclidean inner product on $\mathbb{R}^n$. The points $v_j$ will satisfy your distance matrix.
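
For completeness, a literal transcription of this construction, assuming numpy (the function name and the rank tolerance are mine). Since the answer above reports an error in this recipe, it is worth comparing the output's pairwise distances against $D$ before relying on it:

```python
import numpy as np

def coords_2d(D, tol=1e-9):
    """Construction as described: project the columns of M onto an
    orthonormal basis {b1, b2} of its column space (rank 2 assumed)."""
    D2 = np.asarray(D, dtype=float) ** 2
    M = (D2[:, 0][:, None] + D2[0, :][None, :] - D2) / 2
    # Gram-Schmidt on the columns of M, skipping columns that add nothing
    # to the span (e.g. the all-zero first column).
    basis = []
    for col in M.T:
        v = col - sum((b @ col) * b for b in basis)
        norm = np.linalg.norm(v)
        if norm > tol:
            basis.append(v / norm)
        if len(basis) == 2:
            break
    b1, b2 = basis
    # v_j = (<b1, m_j>, <b2, m_j>) for each column m_j of M.
    return np.column_stack([M.T @ b1, M.T @ b2])
```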