This is a homework problem, so I'll tag it as such, but I'm having a bit of trouble in my Real Analysis class. The problem I have is this:
Prove that the function $f$ is continuous at the point $a$ if and only if, for every $\epsilon>0$, there is a $\delta>0$ such that $|f(x)-f(y)|<\epsilon$ whenever $x$ and $y$ are both in the interval $(a-\delta, a+\delta)$.
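For reference, the definition of continuity at $a$ that we are working from is the usual $\epsilon$-$\delta$ one (I'm assuming this is the intended definition, since the problem doesn't restate it):
$$f \text{ is continuous at } a \iff \text{for every } \epsilon>0 \text{ there is a } \delta>0 \text{ such that } |f(x)-f(a)|<\epsilon \text{ whenever } |x-a|<\delta.$$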
I feel like this should be fairly simple; I just need a starting place, since some of the logic is still new to me and I don't really know where to begin. If anyone would be so kind as to give me a hand, I'd really appreciate it!
Thanks!!
EDIT: I know enough to say that if $f$ is continuous at $a$ then there is an $\epsilon$-neighborhood about $a$ in the domain, but why does the fact that $x$ and $y$ are both in the $\delta$-neighborhood of $a$ imply that they are in the $\epsilon$-neighborhood? If $x$ and $y$ were both in the $\epsilon$-neighborhood of $a$, then $|f(x)-f(y)|$ would always be less than $\epsilon$, correct? That would give the conclusion, but how does the $\delta$-neighborhood come into play?
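In case it helps to see what I've been trying: my guess (and it's only a guess) is to apply the definition of continuity with $\epsilon/2$ in place of $\epsilon$, take the resulting $\delta$, and then use the triangle inequality:
$$|f(x)-f(y)| \le |f(x)-f(a)| + |f(a)-f(y)| < \tfrac{\epsilon}{2} + \tfrac{\epsilon}{2} = \epsilon \quad \text{whenever } x, y \in (a-\delta, a+\delta).$$
Is that the right idea for the forward direction, and if so, how should the converse direction go?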