I take it that you meant to ask about the solution of the equation and its asymptotic behaviour, not its complexity. The two are related: if you can solve a problem of size $n$ by splitting it into roughly $\sqrt{n}$ parts of size roughly $\sqrt{n}$ each, doing work $n$ in the process, then $T(n)$ gives the amount of work required to solve a problem of size $n$, and the asymptotic behaviour of that work determines the complexity. But this is the complexity of the algorithm, not of the equation.
As Aryabhata pointed out in the comments, Didier's approach is good but the answer was wrong. Starting from Didier's rewritten version,
$\frac{T(n)}n = \frac{T(\sqrt{n})}{\sqrt{n}}+1\;,$
we can introduce $a(n)=T(n)/n$ to obtain
$a(n)=a(\sqrt{n})+1\;.$
With $b(k)=a(\mathrm e^k)$, this becomes
$b(k)=a(\mathrm e^k)=a(\mathrm e^{k/2})+1=b(k/2)+1\;.$
Then $c(j)=b(2^j)$ yields
$c(j)=b(2^j)=b(2^j/2)+1=b(2^{j-1})+1=c(j-1)+1\;.$
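The chain of substitutions just says that each square root halves $\log n$, i.e. decrements $j=\log_2\log n$ by one. A quick numerical sketch (the cutoff below which we stop iterating is an assumption standing in for a base case; it only shifts the count by a constant):

```python
import math

def a_iter(n, cutoff=2.0):
    # Iterate a(n) = a(sqrt(n)) + 1 directly, taking a(m) = 1
    # for m below the assumed cutoff.
    count = 0
    while n >= cutoff:
        n = math.sqrt(n)
        count += 1
    return count + 1

# The step count tracks j = log2(log n) up to a bounded offset.
for n in (1e2, 1e4, 1e16):
    print(n, a_iter(n), math.log2(math.log(n)))
```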
This is trivially solved by $c(j)=j+C(\{j\})$, where $\{j\}$ is the fractional part of $j$ and $C$ is an arbitrary function: since $\{j\}=\{j-1\}$, the $C$ terms cancel and $c(j)-c(j-1)=1$. Retracing the substitutions yields
$T(n)=na(n)=nb(\log n)=nc(\log_2\log n)=n(\log_2\log n+C(\{\log_2\log n\}))\;.$
This is $\Theta(n\log\log n)$ if $C$ is bounded, which it is for any concrete choice of base case.
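As a sanity check, one can evaluate the recurrence directly and watch $T(n)/n - \log_2\log n$ stay bounded. This is only a sketch: the base case $T(n)=n$ for $n<2$ is an assumption, and changing it merely shifts the bounded term $C$.

```python
import math

def T(n, base=2.0):
    # The recurrence T(n) = sqrt(n) * T(sqrt(n)) + n, with an
    # assumed base case T(n) = n for n below `base`.
    if n < base:
        return n
    s = math.sqrt(n)
    return s * T(s) + n

# T(n)/n minus log2(log n) should stay within a bounded band
# as n grows, consistent with Theta(n log log n).
for n in (1e3, 1e6, 1e12, 1e24):
    print(f"n = {n:g}: T(n)/n - log2(ln n) = "
          f"{T(n)/n - math.log2(math.log(n)):.3f}")
```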