
I have a problem concerning the orthogonalization of a coordinate system; this is necessary in the context of a normal mode analysis of molecular vibrations. I am working on $\mathrm{H_2O}$, which gives a 9-dimensional vector space with six (mutually orthogonal) basis vectors predetermined, describing the translational and rotational motion of the entire molecule. I want to determine the three remaining vectors by a modified Gram-Schmidt process, but in my case this somehow fails because Gram-Schmidt constructs a zero vector.

As far as I understand, Gram-Schmidt produces a zero vector only if there is a linear dependence somewhere in the set of input vectors, but given that my six vectors are mutually orthogonal, I don't see how that can be the case here (let alone how I could avoid it).

The six predetermined vectors are:

    trans-x   trans-y   trans-z    rot-xx    rot-yy    rot-zz
     3.9994        0         0         0    0.2552         0
          0   3.9994         0   -0.2552         0         0
          0        0    3.9994         0         0         0
     1.0039        0         0         0   -0.5084   -0.7839
          0   1.0039         0    0.5084         0         0
          0        0    1.0039    0.7839         0         0
     1.0039        0         0         0   -0.5084    0.7839
          0   1.0039         0    0.5084         0         0
          0        0    1.0039   -0.7839         0         0

Can you see where the problem lies? I have been looking at this for a few days now, including trying alternative approaches to the orthogonalization problem, and I am starting to get frustrated. Given that my Gram-Schmidt algorithm produces a valid 9-dimensional orthogonal set if I use only the first three vectors (the translational coordinates), I assume my implementation is correct and the problem lies somewhere in the rotational coordinate vectors. But I am at a loss about what exactly is going wrong here. (In the end, it's probably just an example of not seeing the forest for the trees ...)
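For illustration, here is a minimal sketch of a modified Gram-Schmidt with an explicit check for the zero-vector case (numpy notation; a simplified stand-in, not my exact implementation):

    import numpy as np

    def modified_gram_schmidt(vectors, tol=1e-10):
        """Orthonormalise `vectors` (rows), flagging linearly dependent input."""
        basis = []
        for v in np.asarray(vectors, dtype=float):
            w = v.copy()
            for q in basis:        # modified G-S: project out of the *updated* w
                w -= (q @ w) * q
            norm = np.linalg.norm(w)
            if norm <= tol * np.linalg.norm(v):
                # the vector collapsed to (numerically) zero: it is linearly
                # dependent on the vectors already processed
                raise ValueError("linearly dependent input vector")
            basis.append(w / norm)
        return np.array(basis)

    # e.g. modified_gram_schmidt([[1, 0, 0], [0, 1, 0], [1, 1, 0]]) raises,
    # because the third vector lies in the span of the first two.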

Regards

-M.

  • I don't understand what it means to orthogonalize against a set of coordinates. Usually one orthogonalizes vectors. I also don't understand why you don't simply list the vectors you're using, since you've already listed $6$ of the $9$ and it seems obvious that the values of the other $3$ are relevant to the question. (2012-10-10)

4 Answers


Your 6 vectors form a six-dimensional subspace of $\mathbb{R}^9$ (if they are independent); you need the three-dimensional subspace that is the orthogonal complement of their span (the column space), which is the left null space. The row space, column space, null space, and left null space of a matrix may be established from the process of reducing the matrix to reduced row echelon form, $EA=R$. The left null space is read out of the $E$ matrix: the three bottom rows of $E$ multiply the matrix $A$ to form the bottom three rows of $R$, which, given the $9\times 6$ shape of $A$ and $R$, will be zero (if all your vectors are independent, it will be exactly the last three rows of $R$ that are zero). This confirms that those rows of $E$ are perpendicular to all of the columns of $A$.

The QR reduction is the way to go, but it is the last three rows of the inverse of $Q$ that you want. Note that the columns of $Q$ are orthogonal; I don't think you can make that assumption about the rows. The following vectors represent the subspace you want, but you will need Gram-Schmidt to orthonormalise them; check whether you need the normalising step.

0,0.3441,-0.2514,0,-1.3691,1,0,0,0

0,0,0,0,-1,0,0,1,0

0,-0.3441,-0.2514,0,1.3691,0,0,0,1
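For what it's worth, a minimal numpy sketch of this idea (my own illustration; it takes the SVD route to the left null space rather than the RREF route described above):

    import numpy as np

    # The six predetermined vectors from the question as the columns of A.
    A = np.array([
        [3.9994, 0.0,    0.0,     0.0,     0.2552,  0.0    ],
        [0.0,    3.9994, 0.0,    -0.2552,  0.0,     0.0    ],
        [0.0,    0.0,    3.9994,  0.0,     0.0,     0.0    ],
        [1.0039, 0.0,    0.0,     0.0,    -0.5084, -0.7839 ],
        [0.0,    1.0039, 0.0,     0.5084,  0.0,     0.0    ],
        [0.0,    0.0,    1.0039,  0.7839,  0.0,     0.0    ],
        [1.0039, 0.0,    0.0,     0.0,    -0.5084,  0.7839 ],
        [0.0,    1.0039, 0.0,     0.5084,  0.0,     0.0    ],
        [0.0,    0.0,    1.0039, -0.7839,  0.0,     0.0    ],
    ])

    # Columns of U beyond rank(A) form an orthonormal basis of the left null
    # space, i.e. the orthogonal complement of the column space of A.
    U, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-10 * s[0]))
    complement = U[:, rank:]                  # here a 9x3 block

    assert np.allclose(complement.T @ A, 0)   # perpendicular to all six columns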


I realized I made a mistake right from the start. Gram-Schmidt orthogonalizes a linearly independent set of vectors. I had simply filled my set up with the remaining Cartesian basis vectors (which would, in this case, be the last three columns of a $9\times9$ identity matrix times the mass-weighting) to create the "intermediate" coordinates to which I applied the Gram-Schmidt algorithm. However, this intermediate set is not linearly independent (just as the comments on the original question supposed), and thus Gram-Schmidt produces a zero vector. No surprise there.
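To make the dependence concrete, here is a quick check of that fill-up set, sketched with numpy and reusing the matrix `A` from the snippet in the answer above (the mass-weighting only rescales the identity columns and cannot change the rank):

    # Append e7, e8, e9 to the six vectors of the question.
    filled = np.hstack([A, np.eye(9)[:, 6:]])
    print(np.linalg.matrix_rank(filled))      # 8, not 9 -- a dependent set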

So the task at hand is now to construct the remaining $3N-6$ vectors so that the entire set is linearly independent. Is there any reliable method to achieve this? (Bonus points if all vectors are already mutually orthogonal, so I won't have to Gram-Schmidt them later.)


Your six vectors span only a six-dimensional space, so a random vector shouldn't be in the span unless you are unlucky. The first, fifth, and sixth vectors lie entirely within the subspace of the first, fourth, and seventh coordinates, so I would just work in the six other dimensions, and might try unit vectors along axes 2, 3, and 5 for a start.
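For what it's worth, a quick numpy check of this suggestion (again reusing the matrix `A` from the earlier sketch) confirms that these three unit vectors complete a linearly independent set:

    # Axes 2, 3 and 5 are the 0-based column indices 1, 2 and 4 of np.eye(9).
    candidate = np.hstack([A, np.eye(9)[:, [1, 2, 4]]])
    assert np.linalg.matrix_rank(candidate) == 9   # nine independent vectors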

  • Okay, so at least I have a way to produce a linearly independent set now; that's fine even if it amounts to some trial and error. Thanks a lot! I will see if there is any reason why the other choices fail; maybe there really is a more elegant way to do this. (2012-10-17)

A robust approach. Let $X \in \mathbb{R}^{3N \times 6}$ be the matrix whose columns are the vectors of interest. Use an orthogonal rotation $U$ ($UU^T=\mathbf{I}$) such that $UX = \pmatrix{T \\ \mathbf{0}}$, where $T$ is upper triangular. This is done by reducing each column from its pivot down with Givens rotations (planar rotations) in a triangular fashion. See below for more detail on such a rotation.

We then have $X = U^T\pmatrix{T \\ \mathbf{0}}$

Since $T$ is triangular, it is easy to find its inverse $P$: $XP = U^T\pmatrix{T \\ \mathbf{0}}P = U^T\pmatrix{ \mathbf{I} \\ \mathbf{0}}$

This $P$ then represents your orthogonalization of $X$: the column mixing $XP$ yields the first six (orthonormal) columns of $U^T$, an orthogonal basis of the span of $X$. The remaining columns of $U^T$ are then your completion of the space, also orthogonal, and you are done.

Givens reduction: $\pmatrix{c & s \\ -s & c}\pmatrix{ a \\ b}$

If either $a$ or $b$ is non-zero, choose $c= \frac{a}{\sqrt{a^2+b^2}} \quad \text{&}\quad s= \frac{b}{\sqrt{a^2+b^2}}$ so that $c^2 + s^2 = 1$ and $\pmatrix{c & s \\ -s & c}\pmatrix{ a \\ b} = \pmatrix{\sqrt{a^2+b^2} \\ 0}$
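A sketch of this construction in numpy (my own illustration, with a hypothetical helper name `givens_triangularize`; the full $m\times m$ rotation matrices are formed explicitly for clarity, not speed):

    import numpy as np

    def givens_triangularize(X):
        """Reduce X (m x n, m >= n) to [T; 0] by a product of Givens rotations.

        Returns (U, T) with U orthogonal (U @ U.T == I), U @ X == [[T], [0]],
        and T upper triangular -- the construction described above.
        """
        m, n = X.shape
        U = np.eye(m)
        T = X.astype(float).copy()
        for j in range(n):                    # work column by column
            for i in range(j + 1, m):         # zero every entry below the pivot
                a, b = T[j, j], T[i, j]
                r = np.hypot(a, b)
                if r == 0.0:                  # both entries already zero
                    continue
                c, s = a / r, b / r           # c^2 + s^2 = 1, as above
                G = np.eye(m)                 # plant the 2x2 rotation in rows j, i
                G[j, j] = G[i, i] = c
                G[j, i], G[i, j] = s, -s
                T = G @ T                     # entry (i, j) becomes zero
                U = G @ U                     # accumulate the rotations
        return U, T[:n, :]

    # With A as the 9x6 matrix from the earlier sketches:
    #     U, T = givens_triangularize(A)
    #     completion = U.T[:, 6:]   # last three columns of U^T: orthonormal and
    #                               # perpendicular to every column of A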

  • Q is exactly the matrix I am looking for, containing all 9 coordinate vectors and being orthogonal (of course, _per constructionem_). The only difference from the vectors I defined above is that the columns of $Q$ are normalized, while my previous definitions were not; but that's merely cosmetic and will be sorted out later in the process. Many thanks again!