If the surface over which we integrate is a level set of $f$, then the gradient of $f$ gives a normal vector $\nabla f(p)$ at each point $p$ of the surface. However, this normal need not have unit magnitude.
By definition the flux integral is $\iint_\Omega \overrightarrow F\cdot \overrightarrow{dS}$, where $\overrightarrow{dS}=\hat n\,dS$, so the integrand is $\overrightarrow F\cdot \hat n$ with $\hat n=\widehat{\nabla f}=\nabla f/\|\nabla f\|$. So I understand we need to normalize the gradient to obtain the integrand.
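For concreteness, take the plane from the question I cite below, which is the level set $f(x,y,z)=x+y=0$: there $$\nabla f=(1,1,0),\qquad \|\nabla f\|=\sqrt 2,\qquad \hat n=\frac 1{\sqrt 2}(1,1,0),$$ so the normalization contributes the factor $\frac 1{\sqrt 2}$.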
However, I also think that $dS=\|r_u\times r_v\|\,du\,dv$ for a parametrization $r(u,v)$ of the surface. Since, as far as I can tell, $r_u\times r_v=\nabla f$, we end up with $$\iint_\Omega \overrightarrow F\cdot \overrightarrow{dS}=\iint_\Omega \big((\overrightarrow F\cdot \hat n)\circ r\big)\,\|\nabla f\|\,du\,dv.$$ To me this looks equal to $$\iint_\Omega (\overrightarrow F\cdot \nabla f)\circ r\,du\,dv.$$
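As a sanity check on the claim $r_u\times r_v=\nabla f$ (with one parametrization of my own choosing, so this is only a single example): for the plane $f(x,y,z)=x+y$ take $r(u,v)=(u,-u,v)$; then $$r_u\times r_v=(1,-1,0)\times(0,0,1)=(-1,-1,0)=-\nabla f,$$ so at least here $\|r_u\times r_v\|=\|\nabla f\|=\sqrt 2$, as I assumed (up to orientation).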
In the question "Using Stoke's theorem evaluate the line integral $\int_L (y i + zj + xk) \cdot dr$ where $L$ is the intersection of the unit sphere and $x+y = 0$", the gradient is normalized to obtain $$\iint_\Omega(\overrightarrow F \cdot \hat n)\circ r \ dS=\frac 1{\sqrt 2}\iint_\Omega dS,$$ and this is said to equal $\frac{\pi}{\sqrt 2}$, which amounts to saying that $\iint_\Omega dS=\pi$. But I think $dS=\sqrt 2\,du\,dv$, so the answer should be $\pi$.
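To spell out where the two computations diverge: the linked answer computes $$\frac 1{\sqrt 2}\iint_\Omega dS=\frac 1{\sqrt 2}\cdot\pi=\frac{\pi}{\sqrt 2},$$ presumably reading $\iint_\Omega dS$ as the area of the unit disk that the plane cuts out of the sphere, whereas by my reasoning $$\frac 1{\sqrt 2}\iint_\Omega dS=\frac 1{\sqrt 2}\iint_\Omega \sqrt 2\,du\,dv=\iint_\Omega du\,dv=\pi.$$ So the disagreement seems to come down to whether the factor $\|\nabla f\|=\sqrt 2$ is already absorbed into $dS$.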
Who is correct?
