4

I'm working on a problem from a past exam and I'm stuck, so I'm asking for help. Here it is: for $A = \frac12 \left[\begin{array}{rrrr} 1 & 1 & 1 & 1 \\ 1 & 1 & -1 & -1 \\ 1 & -1 & 1 & -1 \\ 1 & -1 & -1 & 1 \end{array}\right]$, find $A^{-1}$.

My problem isn't computing the inverse itself: we just compute the determinant, check that it's nonzero, form the adjugate (adjoint) matrix and divide it by the determinant.
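In formula form, the procedure I mean is

$A^{-1} = \frac{1}{\det A}\,\operatorname{adj}(A),$

where $\operatorname{adj}(A)$ is the transpose of the matrix of cofactors.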

My problem is space. As you can see, it's a 4x4 matrix, which means I'd have to compute sixteen 3x3 determinants to get the adjugate, plus four more 3x3 determinants to expand the determinant of the matrix along a row. We get one A3 sheet of paper for 6 problems; the problems are printed on one side and the other side is blank. This, together with the fact that the inverse matrix is $A^{-1} = \frac12 \left[\begin{array}{rrrr} 1 & 1 & 1 & 1 \\ 1 & 1 & -1 & -1 \\ 1 & -1 & 1 & -1 \\ 1 & -1 & -1 & 1 \end{array}\right]$ (i.e. $A$ again),

led me to believe that there's some catch that I'm not seeing. Any ideas what it could be?

Also, if someone could edit these matrices from MATLAB format into something that this site will parse, that would be great!

EDIT Unfortunately it seems that TeX code for matrices doesn't work here. Here's the matrix in MATLAB form, if anyone wants it: A=(1/2)*[1,1,1,1;1,1,-1,-1;1,-1,1,-1;1,-1,-1,1];

EDIT 2 The answer by Jack Schmidt contains working code for the matrices.

  • 0
    @Tobias Kienzler Jack Schmidt posted code which makes matrices work! 2010-08-10

4 Answers

3

I guess $A$ is orthogonal and symmetric, so that tells you $A^{-1} = A^T = A$, but that's not a very common situation to my mind. Maybe someone else has a better "test-taking strategy" explanation, but personally I would just row reduce, or whatever method you use in general.

An orthogonal matrix is defined to be a matrix whose transpose is its inverse. However, for us the better (almost-)definition is a matrix whose rows (or columns) are mutually orthogonal, as in perpendicular. So $(1,1,1,1)$ is orthogonal to $(1,-1,1,-1)$ since their dot product $(1)(1)+(1)(-1)+(1)(1)+(1)(-1) = 1 - 1 + 1 - 1$ is zero. You should also check that each row has length $1$: here $\sqrt{(1/2)^2 + (1/2)^2 + (1/2)^2 + (1/2)^2} = 1$, so good, but even if not, that part is easily fixed by rescaling.

Sometimes you can tell just by looking that a matrix is orthogonal.
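If you'd rather double-check numerically than by eye, here is a small NumPy sketch (just an illustration, not something you'd need on the exam) verifying that $AA^T = I$ and $A = A^T$, hence $A^{-1} = A$:

```python
import numpy as np

A = 0.5 * np.array([[1,  1,  1,  1],
                    [1,  1, -1, -1],
                    [1, -1,  1, -1],
                    [1, -1, -1,  1]])

I = np.eye(4)
print(np.allclose(A @ A.T, I))   # True: rows are orthonormal, so A is orthogonal
print(np.allclose(A, A.T))       # True: A is symmetric
print(np.allclose(A @ A, I))     # True: hence A is its own inverse
```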


As far as parsing goes:

Here is the matrix:

$A = \frac12 \begin{pmatrix} 1 &  1 &  1 &  1 \\\\ 1 &  1 & -1 & -1 \\\\ 1 & -1 &  1 & -1 \\\\ 1 & -1 & -1 &  1 \end{pmatrix}$ 

$A = \frac12 \begin{pmatrix}1&1&1&1\\1&1&-1&-1\\1&-1&1&-1\\1&-1&-1&1\end{pmatrix}$

Here it is using the array environment:

$A = \frac12 \left(\begin{array}{rrrr} 1 &  1 &  1 &  1 \\\\ 1 &  1 & -1 & -1 \\\\ 1 & -1 &  1 & -1 \\\\ 1 & -1 & -1 &  1 \end{array}\right)$ 

$A = \frac12 \left(\begin{array}{rrrr}1&1&1&1\\1&1&-1&-1\\1&-1&1&-1\\1&-1&-1&1\end{array}\right)$

The backslashes get eaten by the markdown software, so you just double them.

  • 0
    Ugh, I deleted my two comments after looking at the original post again on a different computer; for some reason I did not see the factor of $1/2$ tucked into the original. To repeat: $A$ is involutory ($A^2=I$) and symmetric, which also implies orthogonality. In fact, it is a Householder matrix, a matrix of the form $I-cvv^T$, with $c=1/2$ and $v = (1, -1, -1, -1)^T$ (equivalently $I - 2uu^T$ for the unit vector $u = v/2$). 2010-08-11
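Spelling out that comment, with $v=(1,-1,-1,-1)^T$:

$vv^T=\begin{pmatrix}1&-1&-1&-1\\-1&1&1&1\\-1&1&1&1\\-1&1&1&1\end{pmatrix},\qquad I-\tfrac12\,vv^T=\frac12\begin{pmatrix}1&1&1&1\\1&1&-1&-1\\1&-1&1&-1\\1&-1&-1&1\end{pmatrix}=A.$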
2

If you have an $n\times n$ matrix $A$, the inverse matrix $A^{-1}$ can be computed by a compact method that consists of the following steps:

  1. Augment your matrix with the identity matrix $I$: $\left( A|I\right) $

  2. Use Gaussian elimination to get an upper triangular matrix $U$ on the left and a matrix $J$ on the right: $\left( U|J\right) $, where $J=\left( J_{1}|J_{2}|\ldots |J_{n}\right) $.

  3. To get the $n$ vector columns of the inverse matrix: $A^{-1}=\left( x_{1}|x_{2}|\ldots |x_{n}\right) $, solve for $x_{k}$, with $k=1,2,\ldots ,n$, $n$ systems of equations $U\cdot x_{k}=J_{k}$.

Note 1: in your problem $n=4$.

Note 2: there is a variant of this method, "Gaussian elimination with partial pivoting", which gives better numerical accuracy when the entries are not exact rational numbers; that is not the present situation. (A small numerical sketch of steps 1-3 follows below.)
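Here is a minimal NumPy sketch of steps 1-3 (my own illustration; the variable names are arbitrary). One caveat: with this particular $A$, plain elimination hits a zero pivot in the second column, so the sketch swaps rows first, which is exactly the partial-pivoting variant mentioned in Note 2:

```python
import numpy as np

A = 0.5 * np.array([[1,  1,  1,  1],
                    [1,  1, -1, -1],
                    [1, -1,  1, -1],
                    [1, -1, -1,  1]], dtype=float)

n = A.shape[0]
M = np.hstack([A, np.eye(n)])            # step 1: the augmented matrix (A | I)

# Step 2: Gaussian elimination with row swaps (partial pivoting).
# For this particular A a zero pivot appears in column 2 without swapping.
for k in range(n):
    p = k + np.argmax(np.abs(M[k:, k]))  # choose the largest pivot in column k
    M[[k, p]] = M[[p, k]]
    for i in range(k + 1, n):
        M[i] -= (M[i, k] / M[k, k]) * M[k]

U, J = M[:, :n], M[:, n:]

# Step 3: back-substitute U x_k = J_k, one column of the inverse at a time.
A_inv = np.zeros((n, n))
for k in range(n):
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (J[i, k] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    A_inv[:, k] = x

print(np.round(A_inv, 12))   # prints A itself: the matrix is its own inverse
```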

So far I have not computed matrix $A^{-1}$ by hand in this case. Instead I used Scientific Notebook (included in Scientific WorkPlace) to find that your matrix satisfies $A=A^{-1}$:

$A=\begin{pmatrix}1/2&1/2&1/2&1/2 \\ 1/2&1/2&-1/2&-1/2\\1/2&-1/2&1/2&-1/2\\1/2&-1/2&-1/2&1/2\end{pmatrix}=A^{-1}$

  • 0
    @J. Mangaldan, the Crout scheme (http://en.wikipedia.org/wiki/LU_decomposition) is new to me. I had in mind the following theorem: if we apply elementary row operations to the augmented matrix (A|I) and transform it into (I|B), then B = A^{-1}. 2010-08-11
1

Have you tried reducing the matrix to row echelon form? Maybe my answer to this question could be helpful.

  • 0
    OT: SFAIK one only gets notified for comments on their own posts or answers, so I understand why you may have missed my comment. Getting back to the matter at hand: this being a 4-by-4 system, Cramer's rule is definitely not a good idea here. I left a quote on this very topic in the question Agusti linked to. 2010-08-11
0

Gauss/Jordan elimination will do it. It lets you find $A^{-1}$ without the bother of finding the determinant first: just augment your original matrix with the identity and let her rip.

On an aside, you can still deduce the determinant from the inverse.

$A^{-1}= \frac{1}{\det A}\,\operatorname{adj}(A),$

so $\det A = 1/\det\left(A^{-1}\right)$, and when $A$ has integer entries the lowest common denominator of the entries of $A^{-1}$ divides $|\det A|$.
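A quick numerical check of that relationship on this particular matrix (a NumPy sketch, not part of any exam answer):

```python
import numpy as np

A = 0.5 * np.array([[1,  1,  1,  1],
                    [1,  1, -1, -1],
                    [1, -1,  1, -1],
                    [1, -1, -1,  1]])

A_inv = np.linalg.inv(A)                    # NumPy uses an LU factorization internally
print(np.allclose(A_inv, A))                # True: the inverse is A itself
print(round(np.linalg.det(A), 10))          # -1.0
print(round(1 / np.linalg.det(A_inv), 10))  # -1.0 as well: det A = 1/det(A^{-1})
```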