You have the likelihood function
$$
L(\theta_1,\theta_2) = \text{constant} \times (\theta_1\theta_2)^{n_1} (1-\theta_1\theta_2)^{n-n_1} (\theta_1\theta_2^2)^{n_2}(1-\theta_1\theta_2^2)^{n-n_2} \tag 1
$$
where "constant" means not depending on $\theta_1$ or $\theta_2$.
Start by letting $\alpha=\theta_1 \theta_2$ and $\beta = \theta_1 \theta_2^2.$ That transforms $(1)$ to
$$
\alpha^{n_1} (1-\alpha)^{n-n_1} \beta^{n_2} (1-\beta)^{n-n_2}.
$$
The logarithm of this expression is
$$
\ell = n_1 \log \alpha + (n-n_1)\log(1-\alpha) + n_2\log\beta + (n-n_2)\log(1- \beta).
$$
So we have
$$
\frac{\partial\ell}{\partial\alpha} = \frac{n_1} \alpha - \frac{n-n_1}{1-\alpha}.
$$
This is $0$ precisely when $\alpha = \dfrac{n_1}{n}$. The same argument applied to $\beta$ shows that $\partial\ell/\partial\beta=0$ precisely when $\beta = \dfrac{n_2}{n}.$
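As a quick numerical sanity check, with hypothetical counts (say $n = 100$ and $n_1 = 30$, values not from the original problem), the part of $\ell$ that depends on $\alpha$ is maximized on a fine grid at approximately $n_1/n$:

```python
import numpy as np

# Hypothetical counts, for illustration only.
n, n1 = 100, 30

# The terms of the log-likelihood that involve alpha.
alpha = np.linspace(0.001, 0.999, 100_000)
ell = n1 * np.log(alpha) + (n - n1) * np.log(1 - alpha)

# The grid maximizer should sit (approximately) at n1/n = 0.3.
alpha_hat = alpha[np.argmax(ell)]
print(alpha_hat)  # close to 0.3
```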
Given \begin{align} \theta_1\theta_2 & = \alpha = \frac{n_1}{n} \tag 2 \\[10pt] \text{and } \theta_1\theta_2^2 & = \beta = \frac{n_2}{n} \tag 3 \end{align} we can divide the left side of $(3)$ by the left side of $(2)$ to get $\theta_2,$ and doing the same with the right sides gives $\theta_2=n_2/n_1.$ Similarly, we can divide the square of the left side of $(2)$ by the left side of $(3)$ to get $\theta_1,$ and doing the same with the right sides gives $\theta_1 = \dfrac{n_1^2}{n_2\, n}.$
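These closed-form estimates can be checked numerically by evaluating the full log-likelihood $(1)$ on a grid over $(\theta_1,\theta_2)$. The counts below ($n=100$, $n_1=30$, $n_2=20$) are hypothetical, chosen so that both estimates land inside $(0,1)$:

```python
import numpy as np

# Hypothetical counts, for illustration only.
n, n1, n2 = 100, 30, 20

# Closed-form estimates from (2) and (3).
theta2_hat = n2 / n1            # 2/3
theta1_hat = n1**2 / (n2 * n)   # 0.45

# Log-likelihood (1), up to the constant, on a grid over (0,1)^2.
t1, t2 = np.meshgrid(np.linspace(0.01, 0.99, 500),
                     np.linspace(0.01, 0.99, 500), indexing="ij")
a, b = t1 * t2, t1 * t2**2
ell = (n1 * np.log(a) + (n - n1) * np.log(1 - a)
       + n2 * np.log(b) + (n - n2) * np.log(1 - b))

# The grid maximizer should agree with the closed-form estimates.
i, j = np.unravel_index(np.argmax(ell), ell.shape)
print(t1[i, j], t2[i, j])  # near 0.45 and 0.667
```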
Here we have used what you often see called the "invariance" of maximum-likelihood estimates, but it's really equivariance rather than invariance.
Note also that the mere fact that the derivative is $0$ does not prove there is a global maximum. In this case, as a function of either $\alpha$ or $\beta$, the likelihood is $0$ at the two endpoints $0$ and $1$, positive between them, and continuous. That proves there is a global maximum somewhere strictly between $0$ and $1$. The function is also everywhere differentiable, so the derivative must be $0$ at a non-endpoint maximum. Since there is only one point where the derivative is $0$, we can conclude that is where the global maximum is.
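Uniqueness of the critical point can also be seen directly from the second derivative: for $\alpha \in (0,1)$,
$$
\frac{\partial^2\ell}{\partial\alpha^2} = -\frac{n_1}{\alpha^2} - \frac{n-n_1}{(1-\alpha)^2} < 0,
$$
so $\ell$ is strictly concave in $\alpha$ on $(0,1)$, and by the same computation strictly concave in $\beta$. A strictly concave differentiable function on an open interval has at most one critical point, which must be its maximum.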