It can be easily seen that $F(z)$ is always holomorphic: it is defined by a Taylor expansion, and a Taylor series defines a holomorphic function everywhere inside its disk of convergence. The only thing we have to avoid are singularities.
And the radius of convergence is at least $1$, because $f(\theta)$, which is the value of $F(z)$ on the unit circle, is by assumption convergent and well-defined. (If the real part of the Taylor expansion converges, the imaginary part has to converge as well.) Because the Taylor expansion converges on the unit circle, its terms stay bounded there, so for any smaller $|z|$ the series is dominated by a convergent geometric series, i.e. it converges everywhere inside the unit disk.
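The comparison argument can be checked numerically. A minimal sketch, with made-up coefficients of unit modulus (bounded but not decaying, as one would get from a series that still makes sense on $|z|=1$) and an arbitrary sample point inside the disk:

```python
import numpy as np

# Made-up coefficients for illustration: unit modulus, so bounded
# but not decaying -- the borderline case for convergence on |z| = 1.
rng = np.random.default_rng(0)
N = 2000
c = np.exp(1j * rng.uniform(0, 2 * np.pi, N))

z = 0.9 * np.exp(0.3j)                 # a point strictly inside the unit disk
partial = np.cumsum(c * z ** np.arange(N))

# Since |c_n z^n| <= |z|^n, the tail after N/2 terms is bounded by the
# geometric tail |z|^(N/2) / (1 - |z|), which is astronomically small here.
gap = abs(partial[-1] - partial[N // 2])
bound = abs(z) ** (N // 2) / (1 - abs(z))
print(gap, bound)
```

The partial sums have long since stabilized at this depth, exactly as the geometric domination predicts.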
The statement would actually work even for distributions, not just "genuine" functions $f(\theta)$. For example, if $f(\theta)=\delta(\theta)$, then the Fourier series is proportional to $1+2\sum_{n=1}^{\infty} \cos n\theta$; keeping the non-negative frequencies, the corresponding Taylor expansion for $F(z)$ is $\sum_{n=0}^{\infty} z^n$, which is a geometric series converging to $1/(1-z)$.
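This geometric series is easy to sanity-check numerically: inside the disk the partial sums match the closed form, while the limit function blows up only as $z$ approaches the unit circle. The sample point below is an arbitrary choice:

```python
import numpy as np

# Partial sums of sum_{n>=0} z^n versus the closed form 1/(1-z)
# at a point strictly inside the unit disk (arbitrary choice).
z = 0.8 * np.exp(1j * np.pi / 5)
partial = np.sum(z ** np.arange(200))
exact = 1 / (1 - z)
err = abs(partial - exact)
print(err)   # truncation error of order |z|^200 / |1 - z|, i.e. tiny

# As z -> 1 along the real axis, 1/(1-z) blows up: the singularity
# sits on the unit circle, not inside the disk.
for r in (0.9, 0.99, 0.999):
    print(r, abs(1 / (1 - r)))
```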
One recovers the delta-function using the usual rules for distributions: $1/(\theta+i\epsilon)$ is equal to the principal value of $1/\theta$ minus $i\pi$ times $\delta(\theta)$, as long as the imaginary part is picked from the expansion. So distributions will produce singularities of $F(z)$ at the unit circle, but nothing strictly inside the unit disk.
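That distributional identity (the Sokhotski–Plemelj formula) can also be tested numerically: the imaginary part of $1/(\theta+i\epsilon)$ is a nascent delta function, so smearing it with a smooth test function should give $-\pi$ times the test function at zero, up to $O(\epsilon)$ corrections. The Gaussian test function and the grid below are arbitrary choices:

```python
import numpy as np

def g(x):
    return np.exp(-x ** 2)           # smooth test function, g(0) = 1

eps = 1e-3
x = np.linspace(-20, 20, 800_001)    # fine grid, spacing well below eps
dx = x[1] - x[0]

# Im 1/(x + i*eps) = -eps / (x^2 + eps^2), a nascent delta function.
imag_part = np.imag(1 / (x + 1j * eps))
smeared = np.sum(imag_part * g(x)) * dx

print(smeared, -np.pi * g(0))        # both close to -pi, up to O(eps)
```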