Suppose we have an infinite fluid surrounding a ball of radius $R$. I want to solve the following PDE, which describes heat diffusion in the fluid (simplifications have been made): $$\partial_t\theta=\alpha\Delta\theta$$ with boundary conditions $\theta(r=R,t)=0$ and $\lim_{r\to\infty}\theta(r,t)=0$, and initial condition $\theta(r,t=0)=f(r)$.

Using separation of variables, I write $\theta(r,t)=F(r)G(t)$ with separation constant $-\lambda^2$. For $G$ I obtain the ODE $$G'+\alpha\lambda^2 G=0,$$ which integrates to $G(t)=A\exp\left(-\alpha\lambda^2 t\right)$. For $F$ I obtain the ODE $$F''+\frac{2}{r}F'+\lambda^2 F=0,$$ which integrates to $F(r)=B\frac{\sin(\lambda r)}{r}+C\frac{\cos(\lambda r)}{r}$.

Since both terms decay like $1/r$, the second boundary condition is automatically satisfied, and the first one yields $$B\sin(\lambda R)+C\cos(\lambda R)=0.$$

But I don't know how to go further. If I put $B=C=0$, I obtain the trivial solution, which is of no use. I could put $B=0$ and $C\neq 0$, or $B\neq0$ and $C=0$, but then the corresponding values of $\lambda$ would yield two different families of solutions. Finally, if I keep both $B\neq 0$ and $C\neq 0$, I obtain $\tan(\lambda R)=D$ where $D=-\frac{C}{B}$, from which I cannot go further.
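For what it's worth, here is a quick symbolic sanity check of my separated solution (my own verification, not part of the derivation above): imposing the condition at $r=R$ on $F$ collapses the $B$/$C$ combination to something proportional to $\sin(\lambda(r-R))/r$, and sympy confirms that this ansatz satisfies the heat equation with the spherically symmetric Laplacian for every $\lambda$:

```python
import sympy as sp

r, t, lam, alpha, R = sp.symbols('r t lam alpha R', positive=True)

# Separated ansatz: F(r) G(t) with the r = R condition built in,
# i.e. theta = sin(lam*(r - R))/r * exp(-alpha*lam**2*t).
# This vanishes at r = R and decays like 1/r as r -> oo.
theta = sp.sin(lam * (r - R)) / r * sp.exp(-alpha * lam**2 * t)

# Radial Laplacian for a spherically symmetric field:
# Delta theta = (1/r^2) d/dr ( r^2 d theta / dr )
laplacian = sp.diff(r**2 * sp.diff(theta, r), r) / r**2

# Residual of the PDE  d theta/dt - alpha * Delta theta
residual = sp.simplify(sp.diff(theta, t) - alpha * laplacian)
print(residual)  # prints 0
```

This only confirms the algebra; it does not tell me how to assemble these modes (there is no constraint singling out discrete values of $\lambda$), which is exactly where I am stuck.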
Thanks for any help, or a pointer to any textbook that deals with this kind of problem.