Let $f:[0,\infty)^n\to \mathbb{R}$ be a function that is continuously differentiable on the interior $(0,\infty)^n$, and suppose that $\frac{\partial f}{\partial x_j}(\mathbf{x})\to -\infty$ as $x_j\to 0^+$ for each $j=1,\dots,n$.
Can it be shown rigorously that when $f$ is minimized over a constraint set determined by a linear equation, say $\{\mathbf{x}=(x_1,\dots,x_n):\sum_j a_j x_j=b,\ x_j\ge 0\}$, the minimizer has no zero entry in any position where the constraint set admits a strictly positive value? The intuition is that if some coordinate of the minimizer were $0$ while a feasible direction increasing that coordinate exists, the divergent partial derivative would let a small move along that direction strictly decrease $f$, contradicting minimality.
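For a concrete illustration (a toy instance I made up, not part of the general claim): take $n=2$, $a_1=a_2=1$, $b=1$ and $f(x_1,x_2)=-\sqrt{x_1}-\sqrt{x_2}$, whose partial derivatives $-\frac{1}{2\sqrt{x_j}}$ blow up to $-\infty$ at the boundary. A brute-force grid search over the feasible segment shows the minimizer staying strictly in the interior:

```python
import math

# Toy instance (n = 2, a_1 = a_2 = 1, b = 1), chosen only to illustrate:
# f(x1, x2) = -sqrt(x1) - sqrt(x2), with df/dx_j = -1/(2 sqrt(x_j)) -> -inf as x_j -> 0+.
# Minimize over the segment x1 + x2 = 1, x1, x2 >= 0, by grid search on x1 = k/N.
N = 100000
argmin = min(range(N + 1), key=lambda k: -math.sqrt(k / N) - math.sqrt(1 - k / N))
print(argmin / N)  # 0.5 -- the grid minimizer is interior, away from x1 = 0 and x1 = 1
```

Here the boundary points $x_1\in\{0,1\}$ are feasible but are not minimizers, consistent with the conjectured behavior.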
Thanks.