Following @Did's hint:
Step 1:
Let $X' = X - E[X]$ and $Y' = Y - E[Y]$ (I suppose I'm assuming here that these expected values actually exist...). Then
$$
Cor(X', Y') = Cor(X, Y)
$$
If we can show that $Y' = c X'$ for some constant $c$, then we have
$$
Y = E[Y] - cE[X] + cX
$$
so letting
$$
a = E[Y] - c E[X] \\
b = c
$$
we then have $Y = a + bX$ as needed. In short: we need only consider the case where $X$ and $Y$ have mean zero.
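As a quick numerical sanity check of Step 1 (an illustration, not part of the proof), here is a sketch in Python/NumPy; the particular distributions and the coefficient are arbitrary choices made just for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 2.0 * x + rng.normal(size=100_000)  # some variable correlated with x

# Center both variables: X' = X - E[X], Y' = Y - E[Y]
xc = x - x.mean()
yc = y - y.mean()

# The (sample) correlation is unchanged by subtracting the means
r = np.corrcoef(x, y)[0, 1]
r_centered = np.corrcoef(xc, yc)[0, 1]
print(abs(r - r_centered) < 1e-10)  # True
```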
Step 2:
Let's examine the case where the correlation is $+1$. (The $-1$ case is very similar, and I leave it to you).
By definition of the correlation, we have
$$
\frac{E(X'Y')}{\sigma_{X'} \sigma_{Y'}} = 1
$$
so that
$$
E(X'Y') = \sqrt{E(X'^2) E(Y'^2)}
$$
Now noting that $\langle X', Y' \rangle = E(X'Y')$ defines an inner product on the space of mean-zero (square-integrable) random variables, the Cauchy-Schwarz-Bunyakovsky inequality tells us that in general,
$$
E(X'Y') \le \sqrt{E(X'^2) E(Y'^2)}
$$
with equality only in the case where $Y'$ is a multiple $cX'$ of $X'$.
I'm being a little sloppy here in two ways. First, what's really required is that $Y'$ and $X'$ be linearly dependent; but since $X'$ is nonzero, that's the same as $Y'$ being a multiple of $X'$. Second, if you change the value of $X'$ or $Y'$ on a set of probability zero, then equality still holds. So what I can really conclude is that
$$
P(Y' - cX' \ne 0) = 0.
$$
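Again purely as an illustration (nothing beyond NumPy is assumed), one can check the Cauchy-Schwarz inequality numerically: for generic mean-zero samples the inequality is strict, while for $Y' = cX'$ it becomes an equality up to floating-point rounding:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
x -= x.mean()  # a mean-zero sample playing the role of X'

# Generic mean-zero Y': the inequality E(X'Y') <= sqrt(E(X'^2) E(Y'^2)) is strict
y = rng.normal(size=50_000)
y -= y.mean()
lhs = np.mean(x * y)
rhs = np.sqrt(np.mean(x**2) * np.mean(y**2))
print(lhs < rhs)  # True

# Y' = cX': equality holds (up to rounding)
c = 3.0
lhs = np.mean(x * (c * x))
rhs = np.sqrt(np.mean(x**2) * np.mean((c * x) ** 2))
print(abs(lhs - rhs) < 1e-9)  # True
```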
But that's exactly what we needed to prove that
$$
P(Y = a + bX) = 1.
$$
(I'm pretty sure that the sloppiness here will offend some folks, but I believe it gets the main ideas across; in the case where $X$ and $Y$ take only finitely many values, each with positive probability, what I've said is exactly correct.)