I have a very simple problem whose solution technique I want to apply to more difficult problems. Here is the simple example version:
Suppose we have four functions: $f_1(x)=\sin(x)$, $g_1(x)=0$, and $f_2(x)=\cos(x)$, $g_2(x)=0$. I want to show that there does not exist an $x\in\mathbb{R}$ for which $f_1(x)=g_1(x)$ and $f_2(x)=g_2(x)$ simultaneously. I proceed like this:
For a contradiction, suppose there exists an $x\in\mathbb{R}$ such that $f_1(x)=g_1(x)$ and $f_2(x)=g_2(x)$, i.e. $\sin(x)=0$ and $\cos(x)=0$. Multiplying the first equation by $\sin(x)$ and the second by $\cos(x)$ gives $$\sin^2(x)=0$$ and $$\cos^2(x)=0.$$ Adding these two equations gives $$\sin^2(x)+\cos^2(x)=0,$$ and the Pythagorean identity then forces $$1=0.$$ Since this is absurd, no such $x\in\mathbb{R}$ exists and the proof is complete.
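As a sanity check, the key step of the argument (that the combined left-hand side collapses to the constant $1$) can be reproduced symbolically. This is just one possible tool; I use SymPy here as an illustration, not as part of the question:

```python
import sympy as sp

x = sp.symbols('x', real=True)

# The two hypothesized equations: sin(x) = 0 and cos(x) = 0.
eq1 = sp.sin(x)
eq2 = sp.cos(x)

# Multiply each equation by sin(x) and cos(x) respectively, then add,
# exactly as in the proof above.
combined = sp.sin(x) * eq1 + sp.cos(x) * eq2

# The sum simplifies to 1, so the hypotheses would force 1 = 0.
print(sp.simplify(combined))  # → 1
```

Since the left-hand side is identically $1$ while the hypotheses force it to be $0$, the contradiction is confirmed.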
I want to be able to apply this method to more complicated functions with $f_i\not\equiv 0$ and $g_i\not\equiv 0$. My questions are: (1) is the above technique correct for the simple example, and (2) what general tools might I use to solve such problems? Simultaneous equations spring to mind for linear $f$ and $g$, for example.
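For the linear case mentioned above, a computer algebra system can decide consistency of the system directly. A minimal sketch with SymPy (my choice of tool, and the linear examples $f_1(x)=2x$, $g_1(x)=4$, $f_2(x)=3x$, $g_2(x)=7$ are hypothetical, not from the question):

```python
import sympy as sp

x = sp.symbols('x')

# Hypothetical linear system: 2x = 4 and 3x = 7 have no common solution,
# while 2x = 4 and 3x = 6 share the solution x = 2.
inconsistent = sp.linsolve([2*x - 4, 3*x - 7], x)
consistent = sp.linsolve([2*x - 4, 3*x - 6], x)

print(inconsistent)  # EmptySet: no x satisfies both equations
print(consistent)    # {(2,)}: the common solution x = 2
```

`linsolve` returns `EmptySet` precisely when the linear system is inconsistent, which is the computational analogue of the nonexistence proof above.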