
I believe the following is true, but I do not know how to prove it.

Given two sets of distinct non-zero complex numbers $z_1, \dots , z_n$ and $w_1, \dots , w_n$, if

$ \begin{bmatrix} z_{1} & \cdots & z_{n} \\ z_{1}^{2} & \cdots & z_{n}^{2} \\ \vdots & \ddots & \vdots \\ z_{1}^n & \cdots & z_{n}^n \end{bmatrix} \begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix} = \begin{bmatrix} w_{1} & \cdots & w_{n} \\ w_{1}^{2} & \cdots & w_{n}^{2} \\ \vdots & \ddots & \vdots \\ w_{1}^n & \cdots & w_{n}^n \end{bmatrix} \begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix} $

then $w_1, \dots , w_n$ is a permutation of $z_1, \dots , z_n$.
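
Written out row by row, the hypothesis is simply that the first $n$ power sums of the two sets agree:

$\sum_{i=1}^{n} z_i^k = \sum_{i=1}^{n} w_i^k \quad \text{for } k = 1, \dots, n.$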

Any ideas?


Background

Again given distinct non-zero complex numbers $z_1, \dots , z_n$, consider the matrix equation:

$ \begin{bmatrix} 1 & z_{1} & z_{1}^2 & \cdots & z_{1}^{n-1} \\ 1 & z_{2} & z_{2}^2 & \cdots & z_{2}^{n-1} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & z_{n} & z_{n}^2 & \cdots & z_{n}^{n-1} \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ \vdots \\ a_{n-1} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix} $

By the fundamental theorem of algebra, a non-zero polynomial of degree at most $n-1$ has at most $n-1$ roots, so this equation can only be satisfied when $a_0, \dots , a_{n-1}$ are all zero; hence the matrix is non-singular. Since the complex numbers $z_1, \dots , z_n$ are non-zero, the matrix

$ \begin{bmatrix} z_{1} & z_{1}^2 & \cdots & z_{1}^{n} \\ z_{2} & z_{2}^2 & \cdots & z_{2}^{n} \\ \vdots & \vdots & \ddots & \vdots \\ z_{n} & z_{n}^2 & \cdots & z_{n}^{n} \end{bmatrix} $

is also non-singular, and so its columns form a basis for $\mathbb{C}^n$. Considered in this light, the claim above compares two bases formed in this way and says, in effect, that if each basis vector has the same dot product with $[1, 1, \dots, 1]$ as the corresponding vector of the other basis, then the two bases agree up to a common permutation of coordinates.
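
A quick numerical sanity check of this non-singularity claim (a numpy sketch; the sample values are arbitrary distinct non-zero complex numbers of my choosing):

```python
# The matrix with rows (z_i, z_i^2, ..., z_i^n) factors as diag(z) times a
# Vandermonde matrix, so its determinant is prod(z_i) * det(Vandermonde).
import numpy as np

z = np.array([1.0, 2.0 + 1.0j, -3.0, 0.5j])  # arbitrary distinct non-zero values
n = len(z)
V = np.vander(z, n, increasing=True)  # rows: 1, z_i, ..., z_i^{n-1}
M = z[:, None] * V                    # rows: z_i, z_i^2, ..., z_i^n
print(np.linalg.det(M))               # non-zero
print(np.prod(z) * np.linalg.det(V))  # same value, in factored form
```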

  • I played around with induction and got stumped by the inductive step. With the insight provided by the answers below, I now see where the inductive step was headed. (2011-02-20)

2 Answers


The result is true. Look at http://en.wikipedia.org/wiki/Symmetric_polynomial and http://en.wikipedia.org/wiki/Newton%27s_identities

A symmetric polynomial in $n$ variables $p(x_1,\dots,x_n)$ is one that does not change under any permutation of the variables.

As explained in the first Wikipedia page linked to above, any symmetric polynomial in $x_1,\dots, x_n$ can be expressed as a polynomial expression with rational coefficients in the power sum symmetric polynomials $p_i(x_1, \dots, x_n)$ for $i=1,\dots,n$, where $p_i(x_1,\dots,x_n)=x_1^i+x_2^i+\dots+x_n^i. $
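
For instance (a standard small case, stated here just for concreteness), with $e_k$ denoting the $k$th elementary symmetric polynomial,

$e_1 = p_1, \qquad e_2 = \frac{p_1^2 - p_2}{2}.$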

The point is then that the coefficients of $q(x)=(x-z_1)(x-z_2)\dots(x-z_n)$ are symmetric polynomials of $z_1,\dots,z_n$ and therefore expressible in terms of the $p_i(z_1,\dots,z_n)$. Since these coefficients completely determine $q(x)$, we have the result you are after.
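
Here is a small numerical sketch of this argument: rebuild the elementary symmetric polynomials from the power sums via the Newton recursion, form $q(x)$, and recover the $z_i$ as its roots (the sample values are arbitrary; numpy is assumed):

```python
import numpy as np

z = np.array([1.0 + 2.0j, -0.5j, 3.0, 2.0 - 1.0j])  # arbitrary distinct non-zero values
n = len(z)
p = [np.sum(z**k) for k in range(1, n + 1)]  # power sums p_1, ..., p_n

# Newton's identities: k*e_k = sum_{i=1}^{k} (-1)^(i-1) * e_{k-i} * p_i, with e_0 = 1
e = [1.0 + 0.0j]
for k in range(1, n + 1):
    e.append(sum((-1) ** (i - 1) * e[k - i] * p[i - 1] for i in range(1, k + 1)) / k)

# q(x) = prod (x - z_i) has coefficients (-1)^k * e_k
coeffs = [(-1) ** k * e[k] for k in range(n + 1)]
print(np.sort_complex(np.roots(coeffs)))  # the z_i, up to ordering
print(np.sort_complex(z))
```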


Yes, this is true. It follows from Newton's identities, which show that the first $n$ power sums of $n$ numbers determine the coefficients of a monic polynomial with those numbers as its roots (in characteristic $0$).
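
For reference, the identities in question read (standard form, with $e_k$ the $k$th elementary symmetric polynomial and $p_k = \sum_i z_i^k$ the $k$th power sum):

$p_k - e_1 p_{k-1} + e_2 p_{k-2} - \cdots + (-1)^{k-1} e_{k-1} p_1 + (-1)^k k\, e_k = 0, \qquad 1 \le k \le n.$

Since $e_k$ enters with coefficient $\pm k$, each $e_k$ can be solved for recursively in terms of $p_1, \dots, p_k$; the division by $k$ is exactly where characteristic $0$ is used.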

A non-recursive formula follows from the exponential formula in combinatorics, which shows that the identity

$\prod_{i=1}^n (1 + tz_i) = \exp \left( p_1 t - \frac{p_2 t^2}{2} + \frac{p_3 t^3}{3} - \cdots \right)$

is equivalent to the identity

$e_k = \frac{1}{k!} \sum_{\pi \in S_k} \text{sgn}(\pi) p_{\pi}$

where

  • $p_k = \sum_{i=1}^n z_i^k$,
  • $e_k$ is the coefficient of $t^k$ in $\prod_i (1 + tz_i)$ (the $k^{th}$ elementary symmetric polynomial of the $z_i$),
  • $\text{sgn}(\pi)$ is the sign of the permutation $\pi$, and
  • $p_{\pi} = p_{\lambda_1} \cdots p_{\lambda_m}$, where the permutation $\pi$ has cycle type $(\lambda_1, \dots, \lambda_m)$.
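
As a sanity check, here is a small symbolic verification of the displayed identity for $k = 1, 2, 3$ in four variables (a sketch using sympy's permutation groups; the number of variables and the range of $k$ are arbitrary choices):

```python
# Symbolic check of e_k = (1/k!) * sum over S_k of sgn(pi) * p_pi.
import sympy as sp
from sympy.combinatorics.named_groups import SymmetricGroup
from itertools import combinations

n = 4
z = sp.symbols(f"z1:{n + 1}")  # z1, z2, z3, z4

def p(m):
    # power sum p_m = z1^m + ... + zn^m
    return sum(zi**m for zi in z)

def e(k):
    # elementary symmetric polynomial e_k
    return sum(sp.prod(c) for c in combinations(z, k))

for k in range(1, 4):
    total = 0
    for pi in SymmetricGroup(k).elements:
        # p_pi = product of p_lambda over the cycle lengths lambda of pi
        p_pi = sp.prod(p(length)**count for length, count in pi.cycle_structure.items())
        total += pi.signature() * p_pi
    assert sp.expand(total / sp.factorial(k) - e(k)) == 0

print("identity verified for k = 1, 2, 3")
```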

A short proof of the first identity is as follows: let $M$ be a matrix whose eigenvalues are $z_1, \dots, z_n$. Then the LHS is $\det (I + tM)$ and the RHS is $\exp \text{tr } \log (I + tM)$. But the identity $\det \exp N = \exp \text{tr } N$, where $N$ is a matrix, is straightforward to prove (e.g. by proving it for $N$ diagonalizable and extending by continuity).
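
A quick numerical check of $\det \exp N = \exp \text{tr } N$ (a sketch assuming scipy for the matrix exponential; the matrix size and seed are arbitrary):

```python
# Numerical check of det(exp(N)) = exp(tr(N)) for a random complex matrix.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
N = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
print(np.linalg.det(expm(N)))  # matches...
print(np.exp(np.trace(N)))     # ...this, up to floating-point error
```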