
I'm trying to understand the effect of certain operations on a matrix on eigenvectors and eigenvalues.

Let $X$ be a square matrix. I need to understand how the eigenvectors change if:

  1. Each column of $X$ is normalized, so that $\hat{x}_{ij} = \frac{x_{ij}}{\sum_i x_{ij}}$, i.e., each entry is divided by the sum of its column, making every column total to 1.
  2. Off-diagonal terms are multiplied by $-1$.
  3. A diagonal matrix $D$ is subtracted from $X$.
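As a concrete way to explore these operations, here is a small numpy sketch; the test matrix, the random seed, and the particular $D$ are arbitrary choices of mine, not anything from the question. It applies each operation and prints the resulting eigenvalues, so one can compare them against the original spectrum.

```python
import numpy as np

# Arbitrary 3x3 test matrix (fixed seed for reproducibility).
rng = np.random.default_rng(0)
X = rng.random((3, 3))

# 1. Normalize each column by its sum, so every column totals 1.
X1 = X / X.sum(axis=0)

# 2. Multiply off-diagonal entries by -1, keeping the diagonal.
X2 = np.where(np.eye(3, dtype=bool), X, -X)

# 3. Subtract a diagonal matrix D (here D = diag(1, 2, 3), an arbitrary pick).
D = np.diag([1.0, 2.0, 3.0])
X3 = X - D

for name, M in [("original", X), ("col-normalized", X1),
                ("negated off-diag", X2), ("shifted by D", X3)]:
    print(name, np.round(np.linalg.eigvals(M), 3))
```

Running this a few times with different seeds makes it plausible that none of the three operations moves the eigenvalues (or eigenvectors) in any obvious, closed-form way.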

My main purpose is to understand how various dimensionality reduction techniques (such as Laplacian Eigenmaps, Multidimensional Scaling, PCA, etc.) differ from each other, by gaining intuition about the effect of these operations on matrices. I've looked for books and articles on such theoretical results but could not find any.

I appreciate any comments/articles/books on this problem.

Thanks,

  • I'm currently trying to make sense of the eigenvectors I've collected for several matrices and will share my results if I happen to discover anything promising. – 2012-04-12

1 Answer


None of the operations 1, 2, 3 has a simple effect on the eigenvalues/eigenvectors in general (as leonbloy already noted). But there is a useful special case of 3: if we subtract a scalar matrix $\alpha I$ from $X$, the eigenvectors remain the same, and every eigenvalue $\lambda$ is replaced by $\lambda - \alpha$.
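The scalar-shift special case is easy to verify numerically. A minimal numpy sketch (the $2 \times 2$ matrix and $\alpha = 2$ are arbitrary choices for illustration): each eigenvector of $X - \alpha I$ is still an eigenvector of $X$, with eigenvalue shifted by $\alpha$.

```python
import numpy as np

X = np.array([[4.0, 1.0],
              [2.0, 3.0]])
alpha = 2.0

vals, vecs = np.linalg.eig(X)
vals_shifted, vecs_shifted = np.linalg.eig(X - alpha * np.eye(2))

# Eigenvalues shift by exactly -alpha (sorted, since eig fixes no order).
assert np.allclose(np.sort(vals_shifted), np.sort(vals) - alpha)

# Each eigenvector of X - alpha*I is an eigenvector of X, with
# eigenvalue lam + alpha.
for lam, v in zip(vals_shifted, vecs_shifted.T):
    assert np.allclose(X @ v, (lam + alpha) * v)
```

This works because $(X - \alpha I)v = Xv - \alpha v = (\lambda - \alpha)v$ whenever $Xv = \lambda v$; subtracting a general diagonal $D$ destroys this argument, since $Dv$ is no longer a multiple of $v$.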