In Knuth's book "The Art of Computer Programming, Vol. 1" there is a description of Euclid's algorithm for finding the greatest common divisor of $m$ and $n$, which contains the following passage:
$m = qn + r$. If $r = 0$, then $m$ is a multiple of $n$, and clearly in such a case $n$ is the greatest common divisor of $m$ and $n$. If $r \neq 0$, note that any number which divides both $m$ and $n$ must divide $m - qn = r$, and any number which divides both $n$ and $r$ must divide $qn + r = m$.
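To make the claim concrete for myself, I tried one instance (the numbers here are my own, not from the book): take $m = 22$ and $n = 6$, so that

$$22 = 3 \cdot 6 + 4, \qquad \text{i.e. } q = 3,\ r = 4,$$

and a common divisor such as $2$ satisfies

$$2 \mid 22,\quad 2 \mid 6, \qquad \text{and indeed}\quad 2 \mid 22 - 3 \cdot 6 = 4.$$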
This checks out in the example, but I don't understand why any common divisor of $m$ and $n$ must divide $m - qn = r$ in general.