
Suppose I have a map $f: \mathbb R^{N} \to \mathbb R^{N}$ whose components are multivariate polynomials of degree $K$:

$$ f^i: X \mapsto A^{i}_0 + A^{ij}_1 X^{j} + A^{ijk}_2 X^j X^k + \ldots + A^{i i_1 \cdots i_K}_K X^{i_1} \cdots X^{i_K} $$

(a sum is implied for each repeated index).

What can be said about the topology of the manifold defined by

$$ J(f) = \det \left( \frac{\partial f^i}{\partial X^j } \right) = 0 .$$

For instance, what can be said about its dimension? How does it depend on $N$ and $K$?
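To make the object concrete, here is a small sketch computing $J(f)$ symbolically for a degree-2 map with $N = 2$ (the coefficients below are arbitrary illustrative choices, not part of the question):

```python
# A sketch (coefficients are arbitrary illustrative choices): computing
# J(f) = det(df^i/dX^j) symbolically for a degree-2 map R^2 -> R^2.
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Matrix([
    1 + 2*x - y + x**2 + x*y,   # f^1: constant + linear + quadratic terms
    -1 + x + 3*y + y**2,        # f^2
])
J = f.jacobian([x, y]).det()
print(sp.expand(J))  # the variety in question is {J = 0}
```

Note that each entry of the Jacobian has degree at most $K-1$, so $J(f)$ is a polynomial of degree at most $N(K-1)$; here $N(K-1) = 2$, matching the expanded output.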

  • 0
This is just a nitpick, but in order to apply the summation convention correctly, the sum must occur over repeated *upper* and *lower* indices. Therefore, to properly align with the summation convention, you should lower the $j, jk, \ldots$ indices and make them subscripts of the $A^i$ elements. (2011-11-21)
  • 0
Note that the space in question is a (real) algebraic variety, but is not necessarily a manifold. In particular, its "dimension" is only locally defined and may vary from point to point. (2011-11-21)
  • 1
@3Sphere, in this case there is no distinction between upper and lower indices, unless I am missing something? (2011-11-21)
  • 0
@lurscher The point about the summation convention is that repeated indices are never summed over unless one index occurs in the upper position and one index occurs in the lower position. So, the expression $a^i_j b^j_k$ implies a sum over $j$ while $a^i_j b^i_k$ implies no summation at all. In your case, the conflict could be resolved by moving the summation indices you have on $A_K$ to the right of your $K$ index as subscripts; this obviously doesn't change the content of your equation, it's just a device to allow the summation convention to be correctly applied. (2011-11-21)
  • 1
@3Sphere: I think you're referring to what's called the Einstein summation convention. lurscher's convention is of his own choosing and he's not bound by any convention of Einstein. (2011-11-24)

1 Answer


The only thing special about your variety is that it is the zero set of the determinant of a matrix whose mixed partials agree. So in dimension 1 there is nothing special at all -- every variety is of this form. In higher dimensions, take any polynomials $a_{ij} \in \mathbb R[x_1,\cdots,x_n]$ such that $\frac{\partial a_{ij}}{\partial x_k} = \frac{\partial a_{ik}}{\partial x_j}$ for all $i,j,k$. Then the variety $\det\begin{pmatrix}a_{11} & \cdots & a_{1n} \\ \vdots & & \vdots \\ a_{n1} & \cdots & a_{nn} \end{pmatrix}=0$ is of the form you're interested in, since this integrability condition makes the matrix the Jacobian of some polynomial map.
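To make the integrability condition concrete, here is a sketch (using sympy, with the same $2\times 2$ matrix as the example below) that checks the condition row by row and recovers a map $f$ whose Jacobian the matrix is:

```python
# Checking the integrability condition da_ij/dx_k = da_ik/dx_j for the
# 2x2 example matrix, and recovering an f whose Jacobian it is.
import sympy as sp

x, y = sp.symbols('x y')
a = sp.Matrix([[x + x**2, y - y**100],
               [x + y,    x + y**2]])

# Row i of a Jacobian is the gradient of f_i, so within each row the
# mixed partials must agree: d(a_i1)/dy == d(a_i2)/dx.
for i in range(2):
    assert sp.diff(a[i, 0], y) == sp.diff(a[i, 1], x)

# Recover f_i: integrate the first entry in x, then fix the y-dependent
# "constant" so the y-derivative matches the second entry.
f = []
for i in range(2):
    g = sp.integrate(a[i, 0], x)
    g += sp.integrate(sp.expand(a[i, 1] - sp.diff(g, y)), y)
    f.append(sp.expand(g))
    assert sp.diff(f[i], x) == sp.expand(a[i, 0])
    assert sp.expand(sp.diff(f[i], y)) == sp.expand(a[i, 1])
print(f)
```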

I suppose there's perhaps something you could say about such varieties but it seems like you can get all kinds of behaviour.

Example: $\det\begin{pmatrix}x+x^2 & y-y^{100} \\ x+y & x+y^2\end{pmatrix} = x^3+(y^2+1)x^2 + (y^{100}+y^2-y)x+y^{101}-y^2$
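Expanding this determinant with a computer algebra system (sympy here) is a quick sanity check on the algebra:

```python
# Sanity check: expand the 2x2 example determinant symbolically.
import sympy as sp

x, y = sp.symbols('x y')
det = sp.Matrix([[x + x**2, y - y**100],
                 [x + y,    x + y**2]]).det()
print(sp.expand(det))
```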

and so on. Do you have a reason to think there's anything special about these varieties?

  • 0
For example, if $K$ is odd, and you pick an arbitrary form $f_1$ of degree $K$ and then put $f_i=x_i^K+x_i$ for each $i\in\{2,\dots,n\}$, the Jacobian is $\tfrac{\partial f_1}{\partial x_1}\prod_{i=2}^n(Kx_i^{K-1}+1)$, whose real zeroes are the same as those of $\tfrac{\partial f_1}{\partial x_1}$, which can have any dimension (as a topological manifold). (2011-11-24)
  • 0
@Ryan, the only reason to be concerned with this variety is that I suspect it gives the boundary of convergence of a polynomial expansion of some function, as I'm asking in this related question: http://math.stackexchange.com/questions/82860/domain-of-convergence-of-f-mathbb-r-n-mapsto-mathbb-rn-taylor-series (2011-11-24)
  • 0
so, knowing how far I am from this variety would give me an idea of how well my Taylor expansion is going to converge (or at least that's the germ of the idea). (2011-11-24)
  • 0
@lurscher: I think you're perhaps thinking about your Taylor series convergence issue in a not-too-useful way. For example, $f(x) = 1-x^2$ has a Taylor expansion which converges with infinite radius, but the derivative matrix is singular at $x=0$. (2011-11-24)
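The counterexample in the last comment can be checked directly (a minimal sketch with sympy):

```python
# f(x) = 1 - x**2 is its own (everywhere-convergent) Taylor series, yet
# its derivative vanishes at x = 0 -- so distance to the singular set of
# the Jacobian need not control the radius of convergence.
import sympy as sp

x = sp.symbols('x')
f = 1 - x**2
print(sp.series(f, x, 0, 10))    # the series terminates: it is f itself
print(sp.diff(f, x).subs(x, 0))  # derivative is singular (zero) at x = 0
```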