Let $R$ be the largest number such that $F=(f_1,f_2)$ is injective in $\{\vec x\colon |\vec x|<R\}$. I don't know how to find $R$ precisely in terms of $a_i,b_i,c_i$ (and suspect there is no nice formula for it), but here are some estimates.
1) You can get a lower bound for $R$ from the following observation: if $F(\vec x)=\vec x+G(\vec x)$ where $G$ satisfies $|G(\vec x)-G(\vec y)|<|\vec x-\vec y|$ for all $\vec x,\vec y$ in its domain, then $F$ is injective. In particular, this applies when the Jacobian matrix of $G$ has operator norm strictly less than $1$. That matrix is
$$DG(x,y)=2\begin{pmatrix}a_1x+b_1y & b_1x+c_1y \\ a_2x+b_2y & b_2x+c_2y \end{pmatrix}.$$
However, the operator norm of a matrix, even a $2\times 2$ one, can be messy. If we use the somewhat sloppy estimates $|x|\le R$, $|y|\le R$ on each entry and bound the operator norm by the sum of the Euclidean norms of the rows, we get
$$\|DG\|\le 2R\left(\sqrt{(|a_1|+|b_1|)^2+(|b_1|+|c_1|)^2}+\sqrt{(|a_2|+|b_2|)^2+(|b_2|+|c_2|)^2}\right).$$
So the map is guaranteed to be injective when
$$R<\frac{1}{2}\left(\sqrt{(|a_1|+|b_1|)^2+(|b_1|+|c_1|)^2}+\sqrt{(|a_2|+|b_2|)^2+(|b_2|+|c_2|)^2}\right)^{-1}.$$
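If you only need a number for concrete coefficients, the bound is easy to evaluate numerically. Here is a minimal Python sketch (the function names are mine, and I am assuming the quadratic part is exactly the $G$ whose Jacobian is written above, i.e. $G_i(x,y)=a_ix^2+2b_ixy+c_iy^2$): it computes the guaranteed radius from the row-norm estimate, and also the true supremum of $\|DG\|$ on a circle, which shows how much the sloppy bound gives away.

```python
import numpy as np

def guaranteed_radius(a1, b1, c1, a2, b2, c2):
    """Radius below which the row-norm estimate forces ||DG|| < 1."""
    # Assumes G_i(x, y) = a_i x^2 + 2 b_i x y + c_i y^2, as above.
    row1 = np.hypot(abs(a1) + abs(b1), abs(b1) + abs(c1))
    row2 = np.hypot(abs(a2) + abs(b2), abs(b2) + abs(c2))
    return 1.0 / (2.0 * (row1 + row2))

def sup_DG_norm(R, a1, b1, c1, a2, b2, c2, n=2000):
    """Numerical sup of the exact operator norm of DG over the circle |x| = R.
    DG is linear in (x, y), so the sup over the closed disk is attained there."""
    t = np.linspace(0.0, 2.0 * np.pi, n)
    worst = 0.0
    for x, y in zip(R * np.cos(t), R * np.sin(t)):
        DG = 2.0 * np.array([[a1 * x + b1 * y, b1 * x + c1 * y],
                             [a2 * x + b2 * y, b2 * x + c2 * y]])
        worst = max(worst, np.linalg.norm(DG, 2))  # largest singular value
    return worst
```

For instance, under this assumed form of $F$, `guaranteed_radius(1, 0, 0, 0, 0, 1)` returns $1/4$ for the map $(x+x^2,\,y+y^2)$, even though that map is clearly injective on a larger disk.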
The above reasoning is basically what you get if you read the proof of the Inverse Function Theorem and try to make it quantitative.
2) Let's also get an upper bound on $R$. In general, the vanishing of the Jacobian determinant does not imply the failure of injectivity. However, our map is a polynomial of degree $2$. Suppose we expand it at a point where the Jacobian determinant vanishes; since $F$ is exactly quadratic, the expansion has no higher-order terms. The linear part has a kernel, and the restriction of the map to the line through that point in the kernel direction is, up to a constant, of the form $(\alpha t^2,\beta t^2)$. Hence, the map is not injective in any neighborhood of such a point. The set $\det DF=0$ is a quadric curve (a conic), so finding the point on it nearest to $(0,0)$ is a feasible task. Again, though, this does not look like a pleasant task... I have to say, if this is a textbook/homework exercise, it appears to be poorly thought out.
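For what it's worth, the distance from the origin to the conic $\det DF=0$ can also be found numerically rather than in closed form. A rough sketch (again under the same assumed form of $F$ as above, with made-up helper names): minimize $x^2+y^2$ subject to $\det DF(x,y)=0$ from several random starting points.

```python
import numpy as np
from scipy.optimize import minimize

def det_DF(p, a1, b1, c1, a2, b2, c2):
    """det(I + DG) at p = (x, y), with DG as written in part 1)."""
    x, y = p
    return ((1 + 2 * (a1 * x + b1 * y)) * (1 + 2 * (b2 * x + c2 * y))
            - 4 * (b1 * x + c1 * y) * (a2 * x + b2 * y))

def nearest_singular_point(coeffs, n_starts=50, seed=0):
    """Distance from the origin to {det DF = 0}; an upper bound on R.
    Returns inf if no feasible point is found (e.g. the conic is empty)."""
    rng = np.random.default_rng(seed)
    best = np.inf
    for _ in range(n_starts):
        p0 = rng.normal(size=2)
        res = minimize(lambda p: p @ p, p0, method='SLSQP',
                       constraints=[{'type': 'eq',
                                     'fun': lambda p: det_DF(p, *coeffs)}])
        if res.success:
            best = min(best, float(np.sqrt(res.fun)))
    return best
```

Of course this only gives a numerical upper bound on $R$ for particular coefficients, not a formula.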