I would like to find the average distance from the center of an ellipse centered at the origin to a point on the ellipse.
$$(\frac{x}{a})^2+(\frac{y}{b})^2=1$$
I thought of converting to polar coordinates, so
$$r^2((\frac{\cos \theta}{a})^2+(\frac{\sin \theta}{b})^2)=1$$
$$r=\left((\frac{\cos \theta}{a})^2+(\frac{\sin \theta}{b})^2 \right)^{-1/2}$$
So the average distance I am looking for is
$$\frac{1}{2\pi} \int_{0}^{2\pi} \left((\frac{\cos \theta}{a})^2+(\frac{\sin \theta}{b})^2 \right)^{-1/2} d\theta$$
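As a sanity check, this θ-average can be approximated numerically (a rough sketch using a plain Riemann sum; the values `a = 2, b = 1` are arbitrary test choices, not from the problem):

```python
import math

def theta_average(a, b, n=200_000):
    """Approximate (1/2π) ∫₀^{2π} r(θ) dθ with a Riemann sum,
    where r(θ) = ((cosθ/a)² + (sinθ/b)²)^(-1/2)."""
    h = 2 * math.pi / n
    total = 0.0
    for k in range(n):
        t = k * h
        total += ((math.cos(t) / a) ** 2 + (math.sin(t) / b) ** 2) ** -0.5
    # (h / 2π) · Σ r(θ_k) simplifies to Σ r(θ_k) / n
    return total / n

print(theta_average(2.0, 1.0))  # lies between b = 1 and a = 2
print(theta_average(3.0, 3.0))  # circle of radius 3: average is exactly 3
```

For a circle ($a = b$) the integrand is constant, so the average reduces to the radius, which is a quick way to check the setup.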
Correct? I don't know how to go from here.
I also thought about alternatively using line integrals so that what I am looking for is,
$$\frac{\int_C \sqrt{x^2+y^2} ds}{\int_C ds}$$
Then using the parametrization $x=a\cos\theta$ and $y=b\sin \theta$, but that still doesn't get me anywhere.
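With the parametrization $x=a\cos\theta$, $y=b\sin\theta$ one has $ds=\sqrt{a^2\sin^2\theta+b^2\cos^2\theta}\,d\theta$, so both line integrals can also be approximated numerically (a sketch under the same arbitrary test values as before; note that this arc-length-weighted average generally differs from the average over θ when $a \neq b$):

```python
import math

def arclength_average(a, b, n=200_000):
    """Approximate (∮_C sqrt(x²+y²) ds) / (∮_C ds) with a Riemann sum,
    using x = a·cosθ, y = b·sinθ, ds = sqrt(a²sin²θ + b²cos²θ) dθ."""
    h = 2 * math.pi / n
    num = den = 0.0
    for k in range(n):
        t = k * h
        ds = math.sqrt((a * math.sin(t)) ** 2 + (b * math.cos(t)) ** 2)
        dist = math.sqrt((a * math.cos(t)) ** 2 + (b * math.sin(t)) ** 2)
        num += dist * ds  # numerator: distance weighted by arc length
        den += ds         # denominator: the perimeter
    return num / den

print(arclength_average(2.0, 1.0))
print(arclength_average(3.0, 3.0))  # circle of radius 3: exactly 3
```

Comparing the two functions for $a \neq b$ shows the two formulations answer slightly different questions: one averages uniformly over the angle θ, the other uniformly over arc length.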
My question is: are my approaches correct, and what can I do to proceed?
Thanks in advance.