6

I am having difficulty solving this problem:

Let $a, b, c \in\mathbb{Z}$, $abc \neq 0$ and $a\neq c$ be such that $$\frac{a}{c} = \frac{a^2+b^2}{c^2+b^2}.$$

Prove that $a^2 + b^2 + c^2$ is not a prime number.

Thanks in advance!

  • 1
    And $3^2+1^2+3\cdot1=13$ and $3^2+2^2+3\cdot2=19$...2012-02-27
  • 0
$3^2+1^2+3\cdot 1=13$ is prime2012-02-27
  • 0
@user25838 maybe you missed something, because it is clear there are some counterexamples2012-02-27
  • 0
I have already edited my problem.2012-02-27
  • 0
Maybe you should change your title as well.2012-02-27
  • 0
    Thanks, I will change it now.2012-02-27

4 Answers

12

W.l.o.g. we may assume that $a,c>0$. The equation $$ \frac{a}{c}=\frac{a^2+b^2}{c^2+b^2} $$ together with the assumption $a\neq c$ quickly gives us $ac=b^2$ as a corollary. Therefore we have $$ a^2+b^2+c^2=a^2+ac+c^2 $$ with the extra condition that $ac=b^2$ is a perfect square.
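
In detail, the step from the displayed equation to $ac=b^2$ is just cross-multiplication and cancellation: $$ a(c^2+b^2)=c(a^2+b^2) \;\Longrightarrow\; b^2(a-c)=ac(a-c) \;\Longrightarrow\; b^2=ac, $$ where the last implication cancels the nonzero factor $a-c$.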

There are two main cases. If $\gcd(a,c)>1$, then that common divisor is also a divisor of $a^2+ac+c^2$, so this latter number won't be a prime. If the numbers $a$ and $c$ are coprime, then the equation $ac=b^2$ and unique factorization force both $a$ and $c$ to be squares. So we can assume that $a=p^2, c=q^2$ for some integers $p,q$. But then we see that $$ a^2+b^2+c^2=p^4+p^2q^2+q^4=(p^2+q^2)^2-p^2q^2=(p^2+pq+q^2)(p^2-pq+q^2). $$ Here $p\neq q$, so both these factors are $>1$, and the claim follows in this case, too.
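
As a sanity check (the numbers here are chosen purely for illustration), take $p=2$, $q=1$, i.e. $a=4$, $b=2$, $c=1$: then $\frac{4}{1}=\frac{16+4}{1+4}$ and $$ a^2+b^2+c^2 = 16+4+1 = 21 = (4+2+1)(4-2+1) = 7\cdot 3, $$ which is composite, as claimed.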

  • 0
another useful description of the solution too2012-02-27
7

$\rm\ a^2+b^2+c^2 = (a+c)^2-b^2 + 2\: (b^2-ac)\ $ so $\rm\:b^2\! = ac\:\Rightarrow\:$ it factors (difference of squares)
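
That is, once $b^2=ac$ is substituted, the identity reads $$ a^2+b^2+c^2 = (a+c)^2-b^2 = (a+c-b)(a+c+b), $$ a product of two integer factors.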

  • 0
    Cool way to show it. (+1)2012-02-28
  • 1
    Good to have you back giving one-liner solutions.2012-02-28
  • 0
Don't you need to do some work to allow for $a+c = b+1$, which only occurs (for integral $a,b,c$, given that $b^2 = ac$) when $a=b=c=1$?2012-02-29
  • 0
@Geoff It's intended to be a (big) hint, like most of my answers. But, alas, prepending "hint" here spoils the one-liner.2012-02-29
3

NOTE: The "solution" below addressed the original quesion, which was "Prove that $a^2 +b^2 +ab$ is not a prime number", and was later changed to its present form. In fact, it is the case that every prime congruent to $1$ (mod $3$) has the form $a^2 +b^2 +ab$ for integers $a$ and $b,$ while no prime congruent to $2$ (mod $3$) has this form. The former (well-kown) statement can be proved in a fashion rather similar to Euler's proof that every prime congruent to $1$ (mod $4$) is a sum of two integer squares. In this case, however, one works with the ring of Eisenstein integers, $R = \mathbb{Z}[\omega],$ where $\omega$ is a primitive (complex) cube root of unity. This is a principal ideal domain. If $p \equiv 1$ (mod $3$) is a rational prime, then the multiplicative group of the field $\mathbb{Z}/p\mathbb{Z}$ contains an element of order $3$. Hence there is an integer $n$ such that $p$ divides $n^{3}- 1,$ but $p$ does not divide $n-1.$ Then $p$ divides $n^{2}+n+1,$ which factors as $(n- \omega)(n-\omega^{2})$ in $R.$ Since $p$ does not divide either of the two factors in $R,$ we must conclude that $p$ is not a prime in $R.$ Hence there are integers $a,b,c,d$ such that $p = (a - b \omega)(c- d\omega)$ in $R,$ where neither $a-b\omega$ nor $c-d\omega$ are units in $R.$ Then multiplying this expression by its complex coinjugate , we see that $p^2 = (a^2 +ab + b^2)(c^2 +cd +d^2).$ Now $a^2 +ab +b^2 \neq 1$ and $c^2 +cd +d^2 \neq 1$ as $a-b\omega$ and $c-d\omega$ are non-units in $R.$ Hence $a^2 +ab +b^2 = p$ (note that it is a positive quanitity). It is an easy exercise that if $q \equiv 2$ (mod $3$), then $q$ remains prime in $R,$ so $q$ can certainly not be written in the form $a^2 +ab+b^2$ for ratonal integers $a$ and $b.$

  • 0
    Since I wrote this answer, or maybe while I was writing it, the question seems to have changed!2012-02-27
  • 0
That seems to be the case. Probably the OP had first reached the conclusion $b^2=ac$ on his/her own, and felt that the claim in the exercise was to show that $a^2+ac+c^2$ is never a prime. As we later saw, key bits from the background were left out. It happens here often for IMHO understandable reasons. Everything that you say is, of course, correct.2012-02-27
  • 0
    Yes, not a problem. I like the Eisenstein integers, and they don't get as much attention as the Gaussian integers, so I'll leave this "solution" to a no longer existent problem up.2012-02-27
  • 1
Do explain the situation at the top of your answer, though.2012-02-28
1

If we rearrange it another way, like this: $ac^2+ab^2 = ca^2+cb^2$, and then collect similar terms, we get $ac(c-a) = b^2(c-a)$, or $b^2 = ac$, so $a^2+b^2+c^2 = a^2+c^2+ac$, which is already known to be prime for some choices of the variables.

  • 0
Why's $a^2+c^2+ac$ not always prime?2012-02-27
  • 0
Sorry, I have updated it; I missed something.2012-02-27
  • 0
    Just try some numbers - e.g. take $a$ and $c$ both even. You should be clear about whether you mean "never prime" or "not always prime". The proof by dato shows that $a^2+b^2+c^2$ (for $a,b,c$ as you describe) is not always prime (this was clearer before the edit, but it still does this). But sometimes it will be, for example when $a=b=c=1$.2012-02-27
  • 0
    Oh, $a\ne c$. Sorry.2012-02-27
  • 0
Take $a=5$, $c=6$; the result would be $91$, which is divisible by $7$.2012-02-27
  • 0
So in some cases it is prime and in some cases not.2012-02-27
  • 0
    No! If we keep the extra assumption that $a\neq c$, then $a^2+b^2+c^2=a^2+ac+c^2$ is **NEVER** prime. See my answer. For example, if $a=1$, $c=3$, then the equation $b^2=ac=3$ does not have integer solutions for $b$.2012-02-27
  • 0
Ah, sorry, I see.2012-02-27