12

The inverse of a non-singular lower triangular matrix is lower triangular.

Construct a proof of this fact as follows. Suppose that $L$ is a non-singular lower triangular matrix. If $b \in \mathbb{R}^n$ is such that $b_i = 0$ for $i = 1, \dots, k$ (where $k \leq n$), and $y$ solves $Ly = b$, then $y_i = 0$ for $i = 1, \dots, k$. (Hint: partition $L$ by the first $k$ rows and columns.)

Can someone tell me what exactly we are showing here and why it will prove that the inverse of any non-singular lower triangular matrix is lower triangular?

  • [Related](http://math.stackexchange.com/q/1058322/28900).

4 Answers

20

Let's write $L^{-1}=[y_1\:\cdots\:y_n],$ where each $y_k$ is an $n\times 1$ matrix.

Now, by definition, $LL^{-1}=I=[e_1\:\cdots\:e_n],$ where $e_k$ is the $n\times 1$ matrix with a $1$ in the $k$th row and $0$s everywhere else. Observe, though, that $LL^{-1}=L[y_1\:\cdots\:y_n]=[Ly_1\:\cdots\: Ly_n],$ so $Ly_k=e_k\qquad(1\leq k\leq n)$

By the proposition, since $e_k$ has only $0$s above the $k$th row and $L$ is lower triangular and $Ly_k=e_k$, then $y_k$ has only $0$s above the $k$th row. This is true for all $1\leq k\leq n$, so since $L^{-1}=[y_1\:\cdots\:y_n],$ then $L^{-1}$ is lower triangular, too.
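The proposition can also be checked numerically. Below is a minimal sketch in Python (the matrix $L$ is an arbitrary made-up example): solving $Ly_k=e_k$ by forward substitution produces a $y_k$ whose entries above row $k$ are all zero.

```python
def forward_solve(L, b):
    """Solve L y = b for lower triangular L by forward substitution."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        s = sum(L[i][j] * y[j] for j in range(i))
        y[i] = (b[i] - s) / L[i][i]
    return y

# A sample nonsingular lower triangular matrix (entries chosen arbitrarily).
L = [[2.0, 0.0, 0.0],
     [1.0, 3.0, 0.0],
     [4.0, 5.0, 6.0]]

for k in range(3):
    e_k = [1.0 if i == k else 0.0 for i in range(3)]
    y_k = forward_solve(L, e_k)
    # Entries of y_k above row k come out exactly zero.
    print(k, y_k)
```

Forward substitution makes the mechanism visible: each $y_i$ depends only on $b_i$ and earlier $y_j$, so a run of leading zeros in $b$ forces the same run of leading zeros in $y$.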

$********$

Here's an alternative (but related) approach.

Observe that a lower triangular matrix is nonsingular if and only if it has all nonzero entries on the diagonal. Let's proceed by induction on $n$. The base case ($n=1$) is simple, as all scalars are trivially "lower triangular".

Now, suppose that all nonsingular $n\times n$ lower triangular matrices have lower triangular inverses, and let $A$ be any nonsingular $(n+1)\times(n+1)$ lower triangular matrix. In block form, then, we have $$A=\left[\begin{array}{c|c}L & 0_n\\\hline x^T & \alpha\end{array}\right],$$ where $L$ is a nonsingular $n\times n$ lower triangular matrix, $0_n$ is the $n\times 1$ matrix of $0$s, $x$ is some $n\times 1$ matrix, and $\alpha$ is some nonzero scalar. (Can you see why this is true?)

Now, in compatible block form, we have $$A^{-1}=\left[\begin{array}{c|c}M & b\\\hline y^T & \beta\end{array}\right],$$ where $M$ is an $n\times n$ matrix, $b,y$ are $n\times 1$ matrices, and $\beta$ is some scalar. Letting $I_n$ and $I_{n+1}$ denote the $n\times n$ and $(n+1)\times(n+1)$ identity matrices, respectively, we have $$I_{n+1}=\left[\begin{array}{c|c}I_n & 0_n\\\hline 0_n^T & 1\end{array}\right].$$ Hence, $$\left[\begin{array}{c|c}I_n & 0_n\\\hline 0_n^T & 1\end{array}\right]=I_{n+1}=A^{-1}A=\left[\begin{array}{c|c}ML+by^T & M0_n+b\alpha\\\hline x^TM+\alpha y^T & y^T0_n+\beta\alpha\end{array}\right]=\left[\begin{array}{c|c}ML+by^T & \alpha b\\\hline x^TM+\alpha y^T & \beta\alpha\end{array}\right].$$

Since $\alpha$ is a nonzero scalar and $\alpha b=0_n$, we must have $b=0_n$. Thus, $$A^{-1}=\left[\begin{array}{c|c}M & 0_n\\\hline y^T & \beta\end{array}\right],$$ and $$\left[\begin{array}{c|c}I_n & 0_n\\\hline 0_n^T & 1\end{array}\right]=\left[\begin{array}{c|c}ML & 0_n\\\hline x^TM+\alpha y^T & \beta\alpha\end{array}\right].$$ Since $ML=I_n$, then $M=L^{-1}$, and by the inductive hypothesis, $M$ is lower triangular.
Therefore, $A^{-1}=\left[\begin{array}{c|c}M & 0_n\\\hline y^T & \beta\end{array}\right]$ is lower triangular, too, as desired.
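As a concrete illustration of this block form (with made-up numbers), take $n=1$ and

$$A=\left[\begin{array}{c|c}2 & 0\\\hline 3 & 4\end{array}\right],\qquad A^{-1}=\left[\begin{array}{c|c}\tfrac{1}{2} & 0\\\hline -\tfrac{3}{8} & \tfrac{1}{4}\end{array}\right].$$

Here $L=[2]$, $x^T=[3]$, $\alpha=4$, so $M=L^{-1}=\left[\tfrac{1}{2}\right]$, $b=0$, $y^T=\left[-\tfrac{3}{8}\right]$, $\beta=\tfrac{1}{4}$, and one checks the block equations directly: $ML=1$, $\alpha b=0$, and $x^TM+\alpha y^T = 3\cdot\tfrac{1}{2} + 4\cdot\left(-\tfrac{3}{8}\right)=0$, as required.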

  • There's nothing really wrong with the proof, but it considers an extra case that isn't really necessary for us, namely the $k=n$ case. In that case, $b$ is the zero vector, and since $L$ is nonsingular, $Ly=b$ implies $y=0$. None of the columns of the identity matrix is the zero vector, so we didn't need that case. Ignoring it, saying "$b_i=0$ for $i=1,\dots,k$" is equivalent to saying "$b$ is all $0$s above the $(k+1)$th row." $e_{k+1}$ satisfies this condition for each $1\leq k<n$. We don't need to worry about $0$ entries in $e_1$.
3

Suppose you have an invertible lower-triangular matrix $L$. To find its inverse, you must solve the matrix equation $LX = I$, where $I$ denotes the $n$-by-$n$ identity matrix.

Based on how matrix multiplication works, the $i^{\text{th}}$ column of $LX$ is equal to $L$ times the $i^{\text{th}}$ column of $X$. In order for $LX = I$, it must be that the first $i-1$ entries in the $i^{\text{th}}$ column of $LX$ are all zero. The hint is that you can prove that this implies that the first $i-1$ entries in the $i^{\text{th}}$ column of $X$ must all be zero. To do this, you can explicitly write out your calculation, using your assumption that $L$ is lower-triangular. You'll get a fairly easy system of linear equations to analyze.
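A small sketch of this column-by-column computation (matrix entries are made up): solve $Lx_j = e_j$ by forward substitution for each $j$, assemble the columns into $X$, and observe that every entry above the diagonal of $X$ comes out zero.

```python
def solve_column(L, b):
    """Solve L x = b for lower triangular L by forward substitution."""
    n = len(b)
    x = [0.0] * n
    for i in range(n):
        x[i] = (b[i] - sum(L[i][j] * x[j] for j in range(i))) / L[i][i]
    return x

# An example nonsingular lower triangular matrix.
L = [[1.0, 0.0, 0.0],
     [2.0, 3.0, 0.0],
     [4.0, 5.0, 6.0]]
n = len(L)

# Columns of the inverse: the j-th column solves L x = e_j.
cols = [solve_column(L, [1.0 if i == j else 0.0 for i in range(n)])
        for j in range(n)]
# Assemble the columns into the matrix X = L^{-1}.
X = [[cols[j][i] for j in range(n)] for i in range(n)]

# Every entry above the diagonal is exactly zero.
assert all(X[i][j] == 0.0 for i in range(n) for j in range(i + 1, n))
print(X)
```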

  • It may help to point out that when you are considering the $j^{\text{th}}$ column, you should take $k = j-1$.
3

In simple form, we can write $A = D(I+L)$, where $A$ is the lower triangular matrix, $D$ is a diagonal matrix, $I$ is the identity matrix, and $L$ is strictly lower triangular (all zeros on the diagonal). Then $A^{-1} = (I+L)^{-1}D^{-1}$, and $D^{-1}$ is simply the diagonal matrix of reciprocals of the diagonal entries. Moreover, $L$ is nilpotent: $L^n = 0$ for an $n \times n$ matrix, since each successive power pushes the nonzero entries further below the diagonal. Hence the Neumann series terminates, $(I+L)^{-1} = I - L + L^2 - L^3 + \cdots + (-1)^{n-1}L^{n-1},$ which is itself a lower triangular matrix, and the product of two lower triangular matrices is again lower triangular.
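This terminating series can be verified numerically. The sketch below (entries made up; $N$ plays the role of the strictly lower triangular part) sums $I - N + N^2$ for a $3\times 3$ strictly lower triangular $N$ and checks that the result inverts $I+N$.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matadd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

n = 3
I = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
# A strictly lower triangular (hence nilpotent) example: N^3 = 0.
N = [[0.0, 0.0, 0.0],
     [2.0, 0.0, 0.0],
     [3.0, 4.0, 0.0]]

# Finite alternating sum I - N + N^2 (higher powers vanish).
inv = I
P = I
sign = -1.0
for _ in range(n - 1):
    P = matmul(P, N)
    inv = matadd(inv, [[sign * x for x in row] for row in P])
    sign = -sign

# (I + N) * inv should be exactly the identity, and inv is lower triangular.
check = matmul(matadd(I, N), inv)
print(check)
```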

1

I was thinking about this same question and have an explanation from an informal perspective:

With invertible matrix $A$, lower triangular $L$, and $LA = B$:

We know that the first row of $B$ is a multiple of the first row of $A$, and the second row of $B$ is a linear combination of the first two rows of $A$, ..., the $k$th row of $B$ is a linear combination of the first $k$ rows of $A$,...

It follows that for any $k$, the first $k$ rows of $A$ and the first $k$ rows of $B$ span the same subspace. Therefore the $k$th row of $A$ is in the subspace spanned by the first $k$ rows of $B$. Furthermore, the $k$th row of $A$ cannot be in the subspace spanned by the first $k-1$ rows of $B$: otherwise it would be in the subspace spanned by the first $k-1$ rows of $A$, contradicting the invertibility of $A$ (its rows are linearly independent). Since for every $k$ the $k$th row of $A$ is a linear combination of the first $k$ rows of $B$, the matrix $L^{-1}$ in $L^{-1}B=A$ must be lower triangular.