I know it's a little silly, but I got the wrong sign several times. Just to be clear, $z=r\cos(\phi)$, $-\frac{\pi}{2}\leq\phi\leq\frac{\pi}{2}$, when converting from Cartesian to spherical coordinates. So, how do I determine the sign of $\phi$? Thanks!
Finding the sign of $\phi$ in spherical coordinates
calculus
coordinate-systems
vector-analysis
spherical-coordinates
- If $\phi$ here is supposed to be co-latitude, the version of spherical coordinates that I am accustomed to has it go from $0$ (North Pole) to $\pi$ (South Pole). It seems as if you're measuring from the equator instead of from the poles? – 2012-02-15
- Yes, exactly. But the method would be the same with little changes, wouldn't it? – 2012-02-15
- If, as you claim, the changes are "little", then you can figure out how to correct $\phi=\arccos\frac{z}{r}$, which assumes measurement from the poles, to your preferred convention, no? – 2012-02-15
- I just happen to be a TA in a calculus course this semester, and we just got to spherical coordinates. We teach that $0\leq\phi\leq\pi$ and $z=r\cos\phi$. – 2012-06-17
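To make the sign question concrete, here is a minimal sketch (not part of the original thread) covering both conventions raised in the comments; the function name and the latitude variant `phi_lat` are illustrative, assuming the co-latitude formula $\phi=\arccos\frac{z}{r}$ quoted above:

```python
import math

def cartesian_to_spherical(x, y, z):
    """Convert Cartesian (x, y, z) to spherical (r, theta, phi).

    Uses the co-latitude convention from the comments:
    phi = arccos(z / r) with 0 <= phi <= pi, so z = r * cos(phi).
    theta = atan2(y, x) is the azimuth.
    """
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.atan2(y, x)
    phi = math.acos(z / r) if r > 0 else 0.0  # measured down from the +z axis
    return r, theta, phi

# Latitude convention (-pi/2 <= phi_lat <= pi/2, measured from the xy-plane):
# phi_lat = pi/2 - phi, equivalently phi_lat = asin(z / r),
# so its sign is simply the sign of z.
```

In either convention the angle is pinned down by $z$ alone: with co-latitude, $\arccos$ already lands in $[0,\pi]$ and no sign choice is needed; with latitude, the angle is positive above the $xy$-plane ($z>0$) and negative below it.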