Prove that, if $a$ and $b$ are real numbers, at least one of which is not $0$, and $i = \sqrt{−1}$, then there is a unique complex number, say $c + di$, such that $(a + bi)(c + di) = 1$.
Proposition: If $a$ and $b$ are real numbers, at least one of which is not $0$, and $i = \sqrt{−1}$, then there is a unique complex number, say $c + di$, such that $(a + bi)(c + di) = 1$.
A (hypothesis): $a$ and $b$ are real numbers, at least one of which is not $0$, and $i = \sqrt{−1}$.
B (conclusion): There is a unique complex number, say $c + di$, such that $(a + bi)(c + di) = 1$.
My Work
B1: $(a + bi)(c + di) = 1$
$\therefore a + bi \not = 0$
B2: $c + di = \dfrac{1}{a + bi}$ where $a + bi \not = 0$
A1: $c + di = \dfrac{1}{a + bi}$ where $a + bi \not = 0$
$\implies (a + bi)(c + di) = 1$ where $a + bi \not = 0$.
Therefore, in A1, I have constructed the object specified in the conclusion ($c + di$).
Now, below, I use the direct uniqueness method to show that the object is unique.
A2: There is a complex number $x + yi$ such that $x + yi = \dfrac{1}{a + bi}$.
$\implies (a + bi)(x + yi) = 1$ where $a + bi \not = 0$
A3: $(a+bi)(x + yi) = (a + bi)(c + di)$ where $a + bi \not = 0$
$\implies x + yi = c + di$
$Q.E.D.$
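The cancellation step in A3 can be spelled out explicitly (an elaboration; the original leaves it implicit). Since A1 gives $c + di = \dfrac{1}{a + bi}$, multiply both sides of A3 on the left by $c + di$:

$$(c + di)(a + bi)(x + yi) = (c + di)(a + bi)(c + di) \implies 1 \cdot (x + yi) = 1 \cdot (c + di) \implies x + yi = c + di.$$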
EDIT: My Work #2
A1: $(a + bi)(c + di) = 1$
$\implies ac + adi + bci + bdi^2 = 1$
$\implies ac + adi + bci - bd = 1$
$\implies (ac - bd) + (ad + bc)i = 1$, where $(ac - bd)$ is the real part and $(ad + bc)$ is the imaginary part.
A2: Let $ac - bd = 1$ and $ad + bc = 0$
$\therefore (ac - bd) + (ad + bc)i = 1 + 0i = 1$
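One way to complete the existence step (a sketch added for review; the original asserts the system but does not solve it) is to solve A2's equations for $c$ and $d$. Multiply $ac - bd = 1$ by $a$ and $ad + bc = 0$ by $b$, then add:

$$a^2 c - abd = a, \qquad b^2 c + abd = 0 \implies (a^2 + b^2)c = a \implies c = \frac{a}{a^2 + b^2}.$$

Eliminating $c$ the same way gives $d = \dfrac{-b}{a^2 + b^2}$. The hypothesis that $a$ and $b$ are not both $0$ guarantees $a^2 + b^2 \neq 0$, so $c$ and $d$ exist.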
Therefore, in A1, I have constructed the object specified in the conclusion ($c + di$).
Now, below, I use the direct uniqueness method to show that the object is unique.
A3: Let $(a + bi)(x + yi) = 1$
A4: $(a + bi)(c + di) = (a + bi)(x + yi)$ where $a \not = 0$ or $b \not = 0$ (so $a + bi \not = 0$).
$\implies c + di = x + yi$
$Q.E.D.$
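As a numerical sanity check (not part of the proof; the formulas $c = a/(a^2 + b^2)$ and $d = -b/(a^2 + b^2)$ are the standard solution of the system in A2), the inverse can be verified with Python's built-in complex type:

```python
import random

def inverse(a: float, b: float) -> tuple[float, float]:
    """Return (c, d) with (a + bi)(c + di) = 1, assuming a, b are not both 0."""
    denom = a * a + b * b  # nonzero since a and b are not both 0
    return a / denom, -b / denom

random.seed(0)
for _ in range(1000):
    a, b = random.uniform(-10, 10), random.uniform(-10, 10)
    c, d = inverse(a, b)
    # (a + bi)(c + di) should equal 1 up to floating-point error
    assert abs(complex(a, b) * complex(c, d) - 1) < 1e-9
```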
I would greatly appreciate it if you could take the time to review my proof for correctness and provide feedback. If there are any errors, please explain what went wrong and what the correct procedure is.