
Determine whether there is an inverse, and determine what the inverse is if it exists: $$\begin{bmatrix}a&b\\c&d\end{bmatrix},$$ where $ad-bc\not= 0$.

I know how to do this for a normal matrix, but I don't understand how to do this for one with the variables.

  • 1
    Find the inverse using the same method as you would for your "normal matrix", then multiply your inverse and the starting matrix and check that you get the identity matrix. (2017-02-17)

4 Answers

1

Try to multiply your matrix by

$$\frac 1{ad-bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$$

which is well defined since $ad-bc\ne 0$.

What can you conclude?

One way of finding this particular matrix is Gaussian elimination (row reduction).
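Carrying out the multiplication explicitly shows why this works:

$$\begin{pmatrix}a&b\\c&d\end{pmatrix}\cdot\frac 1{ad-bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}=\frac 1{ad-bc}\begin{pmatrix} ad-bc & -ab+ab \\ cd-cd & -bc+ad \end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix},$$

and the same computation with the factors in the other order also gives the identity, so this matrix is indeed the inverse.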

  • 0
    It seems like that's the way I was supposed to do this, but I don't understand why you would do it, and why the matrix is rearranged that way. (2017-02-17)
  • 1
    @JanoyCresva If you haven't noticed yet, what E.Joseph has written is the inverse of your matrix by definition. Look at how you would calculate the inverse of your normal matrix where the variables are replaced by numbers; you might be doing the same process without really noticing it. (2017-02-17)
  • 0
    Yes, I just realized that it's the definition of the inverse of a $2\times 2$ matrix. It just wasn't clear how he got there. (2017-02-17)
  • 0
    @JanoyCresva I did Gaussian elimination, but I am not able to get this matrix. (2017-05-06)
  • 0
    Never mind, got it. (2017-05-06)
1

If a matrix has an inverse, we can find it with the following method (in this case, because $ad-bc \neq 0$, we can certainly find an inverse, as you will notice throughout the work):

$(A|I_n) \equiv (I_n|A^{-1}) $

where $\equiv$ means that the two matrices are row equivalent. Thus, look for the reduced row echelon form of:

$$\left[\begin{array}{cc|cc}a&b&1&0\\c&d&0&1\end{array}\right].$$

What appears in the last two columns will be the inverse you are looking for.
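As a concrete sanity check, the same row reduction can be run numerically on a sample matrix (a sketch using NumPy; the values of $a,b,c,d$ below are hypothetical, chosen so that $a\neq 0$ and $ad-bc\neq 0$) and compared against the closed-form inverse:

```python
import numpy as np

# Sample values for a, b, c, d (hypothetical; any values with ad - bc != 0 work,
# and this particular elimination order also needs a != 0).
a, b, c, d = 2.0, 3.0, 4.0, 5.0
det = a * d - b * c
assert det != 0

# Augmented matrix (A | I_2).
M = np.array([[a, b, 1.0, 0.0],
              [c, d, 0.0, 1.0]])

# Row-reduce the left block to the identity.
M[0] /= M[0, 0]            # R1 <- R1 / a
M[1] -= M[1, 0] * M[0]     # R2 <- R2 - c*R1
M[1] /= M[1, 1]            # R2 <- R2 / pivot  (pivot = (ad - bc)/a)
M[0] -= M[0, 1] * M[1]     # R1 <- R1 - (b/a)*R2

A_inv = M[:, 2:]           # the right block is now A^{-1}

# Compare against the closed-form inverse (d, -b; -c, a) / (ad - bc).
expected = np.array([[d, -b], [-c, a]]) / det
assert np.allclose(A_inv, expected)
```

The same four row operations work symbolically, which is exactly the computation the answer describes.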

  • 0
    This was originally how I was trying to do it, but I didn't understand how I could reduce the variable part to the identity. (2017-02-17)
  • 0
    Start with $aR_2 - cR_1$, as you would do when the entries aren't variables. (2017-02-17)
  • 0
    I tried this. But no matter what, I would always end up with $\begin{bmatrix}-ac&0&adc/(ad-bc)&abc/(ad-bc)\\0&1&-c/(ad-bc)&a/(ad-bc)\end{bmatrix}$ (2017-05-06)
  • 0
    @Math_QED How do I proceed now? I know that if $a$ is zero then $c$ has to be non-zero and vice versa. (2017-05-06)
  • 0
    Never mind, got it. (2017-05-06)
  • 1
    Wow, this post is old. (2017-05-06)
1

You need to solve this equation:

$$\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}a'&b'\\c'&d'\end{bmatrix} =\begin{bmatrix}1&0\\0&1\end{bmatrix}.$$

After expansion, you get the system

$$\begin{cases}aa'+bc'=1,\\ca'+dc'=0,\\ab'+bd'=0,\\cb'+dd'=1,\end{cases}$$ which splits into two $2\times2$ subsystems.

Then, by Cramer's rule,

$$\begin{cases} a'=\frac{\left|\begin{matrix}1&b\\0&d\end{matrix}\right|}{\left|\begin{matrix}a&b\\c&d\end{matrix}\right|}, \quad b'=\frac{\left|\begin{matrix}0&b\\1&d\end{matrix}\right|}{\left|\begin{matrix}a&b\\c&d\end{matrix}\right|},\\ c'=\frac{\left|\begin{matrix}a&1\\c&0\end{matrix}\right|}{\left|\begin{matrix}a&b\\c&d\end{matrix}\right|}, \quad d'=\frac{\left|\begin{matrix}a&0\\c&1\end{matrix}\right|}{\left|\begin{matrix}a&b\\c&d\end{matrix}\right|}, \end{cases}$$
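Evaluating the four $2\times2$ determinants in the numerators makes the result immediate:

$$\left|\begin{matrix}1&b\\0&d\end{matrix}\right|=d,\qquad \left|\begin{matrix}0&b\\1&d\end{matrix}\right|=-b,\qquad \left|\begin{matrix}a&1\\c&0\end{matrix}\right|=-c,\qquad \left|\begin{matrix}a&0\\c&1\end{matrix}\right|=a,$$

while every denominator is $ad-bc$,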

giving

$$\begin{bmatrix}\frac d{ad-bc}&-\frac b{ad-bc}\\-\frac c{ad-bc}&\frac a{ad-bc}\end{bmatrix}.$$

0

There are other answers, but I wanted to include this one, in which we reduce the matrix to the identity; this establishes the existence of the inverse.

Denote the given matrix by $A.$

1) $c=0$

If $c=0,$ then $a$ and $d$ both have to be non-zero; otherwise we would have $ad-bc=0.$

Row-reducing $\begin{bmatrix}a&b\\0&d\end{bmatrix}\to \begin{bmatrix}1&b/a\\0&1\end{bmatrix}\to \begin{bmatrix}1&0\\0&1\end{bmatrix}$

Thus $A$ is invertible.

2) $a=0$

If $a=0,$ then $b$ and $c$ both have to be non-zero; otherwise we would have $ad-bc=0.$

Row-reducing $\begin{bmatrix}0&b\\c&d\end{bmatrix}\to\begin{bmatrix}0&1\\1&d/c\end{bmatrix}\to\begin{bmatrix}0&1\\1&0\end{bmatrix}\to\begin{bmatrix}1&0\\0&1\end{bmatrix}$

$A$ is invertible in this case too.

3) $a\ne 0, c\ne 0$

Row-reducing $\begin{bmatrix}a&b\\c&d\end{bmatrix}\to \begin{bmatrix}ac&bc\\ac&ad\end{bmatrix}\to\begin{bmatrix}ac&bc\\0&ad-bc\end{bmatrix}\to\begin{bmatrix}ac&bc\\0&1\end{bmatrix}\to\begin{bmatrix}ac&0\\0&1\end{bmatrix}\to\begin{bmatrix}1&0\\0&1\end{bmatrix}.$

Thus $A$ is invertible in all cases, and the inverse exists.
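The three cases above can be checked numerically (a sketch with NumPy; the sample entries are hypothetical, chosen so that $ad-bc\neq 0$ in each case):

```python
import numpy as np

# One sample matrix per case in the answer (hypothetical entries).
cases = {
    "c = 0":        np.array([[2.0, 3.0], [0.0, 5.0]]),
    "a = 0":        np.array([[0.0, 3.0], [4.0, 5.0]]),
    "a, c nonzero": np.array([[2.0, 3.0], [4.0, 5.0]]),
}

for name, A in cases.items():
    det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
    assert det != 0, name                      # ad - bc != 0 in every case
    A_inv = np.linalg.inv(A)                   # inverse exists
    assert np.allclose(A @ A_inv, np.eye(2))   # and A * A^{-1} = I_2
```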

The explicit form of the inverse is given in one of the other answers to the question.