Ever since I solved my first equation for $x$, it was drilled firmly into my head that, generally, to "solve" for $n$ variables $x_1, \ldots, x_n$ you need to specify $n$ functions $f_i : \mathbb{R}^n \to \mathbb{R}$ that all vanish at the solution point $(x_1, \ldots, x_n)$. This was, roughly, a necessary and sufficient condition for getting a finite (nonzero) number of solutions.
Some years down the road, I learned linear algebra, and the situation for systems of linear equations became clear: "well-posed" solve-for-$x$ problems were exactly the full-rank ones. For rank-deficient matrices, there were either $0$ or infinitely many solutions, and in the latter case the solution space could still be quantified by its dimension.
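To make the linear case concrete, here's a minimal sketch of what I mean (my own toy matrices, using numpy): full rank gives a unique solution, while a rank deficit of $k$ leaves a $k$-dimensional solution space by rank-nullity.

```python
import numpy as np

# Full-rank 2x2 system: rank 2, so a unique solution exists
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 4.0])
print(np.linalg.matrix_rank(A))  # 2
x = np.linalg.solve(A, b)        # the unique solution

# Rank-deficient system: second row is twice the first
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.matrix_rank(B))  # 1
# If solutions exist at all, the solution set is an affine subspace
# of dimension n - rank = 2 - 1 = 1 (rank-nullity)
```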
Is there a way to formalize this idea a bit more rigorously for continuous functions in general? I have a rough grasp of dimension in algebraic geometry: my intuition is that the "effectiveness" of a system of equations is measured by the dimension of its variety, since each time you quotient the coordinate ring by one more equation, it's as if you "substituted" one equation into another like we all did as kids. Does this algebraic notion of dimension agree with the linear-algebraic intuition (the dimension of the tangent space)? And is there a concrete (possibly differential) way to compute this dimension efficiently by hand?
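To illustrate the tangent-space count I have in mind, here's a small sketch (my own example, using sympy): for the unit sphere, one equation in three variables, I'd expect dimension $3 - 1 = 2$, and the rank of the Jacobian at a smooth point seems to recover exactly that.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

# One equation in three variables: the unit sphere in R^3
F = sp.Matrix([x**2 + y**2 + z**2 - 1])
J = F.jacobian([x, y, z])  # [2x, 2y, 2z]

# Evaluate the Jacobian at a point on the variety
p = {x: 1, y: 0, z: 0}
rank = J.subs(p).rank()          # rank 1 at a smooth point

# Tangent space dimension = ambient dim - Jacobian rank
print(3 - rank)  # 2
```

My question is essentially whether this Jacobian-rank count at a smooth point always agrees with the algebraic (Krull) dimension of the variety.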