Suppose $f$ belongs to an RKHS with a Gaussian kernel:
$$ f(\cdot)=\sum_{i=1}^{n} a_i K(b_i,\cdot)$$
where the $b_i$ are given. I now wish to find
$$ g(\cdot)=\sum_{i=1}^{n} a_i' K(b_i,\cdot)$$
such that $\|f-g\|_\mathcal{H}^2$ is minimized subject to $g(\cdot)>0$ everywhere.
To formulate the objective, it is not hard to see, using the reproducing property $\langle K(x,\cdot),K(y,\cdot)\rangle=K(x,y)$, that minimizing $\|f-g\|_\mathcal{H}^2$ amounts to minimizing $(a-a')^\top M (a-a')$, where $a,a'$ are the coefficient vectors and $M_{ij}=K(b_i,b_j)$. Since the Gaussian kernel is strictly positive definite, $M$ is positive definite when the $b_i$ are distinct, so this is essentially a quadratic program. The difficulty I'm experiencing is finding an efficient way to represent the constraint $g(\cdot)>0$.
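For completeness, the reduction to the quadratic form follows directly from bilinearity of the inner product and the reproducing property:
$$\|f-g\|_\mathcal{H}^2=\Big\langle \sum_{i=1}^{n}(a_i-a_i')K(b_i,\cdot),\ \sum_{j=1}^{n}(a_j-a_j')K(b_j,\cdot)\Big\rangle=\sum_{i,j=1}^{n}(a_i-a_i')(a_j-a_j')K(b_i,b_j)=(a-a')^\top M (a-a').$$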
For any fixed $t$, the condition $g(t)>0$ is a linear constraint on $a'$, namely $\sum_{i=1}^{n} a_i' K(b_i,t)>0$. However, requiring $g(\cdot)>0$ at every point makes the number of constraints infinite. One can obviously discretize the time axis, but the computational cost is high, and the constraint becomes weaker: positivity is only enforced at the grid points.
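For concreteness, here is a minimal sketch of the discretization approach in Python with SciPy. The centers, coefficients, grid, and the margin `eps` are all illustrative choices of mine, and (as noted above) the constraint only guarantees $g(t_j)\ge\epsilon$ at the grid points $t_j$, not positivity everywhere:

```python
import numpy as np
from scipy.optimize import minimize

def gauss_kernel(x, y, sigma=1.0):
    # Gaussian kernel K(x, y); sigma is an illustrative bandwidth choice
    return np.exp(-(x - y) ** 2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
b = np.linspace(0.0, 1.0, 8)              # given centers b_i
a = rng.normal(size=b.size)               # given coefficients a_i (f may dip below 0)

M = gauss_kernel(b[:, None], b[None, :])  # Gram matrix M_ij = K(b_i, b_j)
t = np.linspace(-0.5, 1.5, 200)           # discretized time axis t_j
G = gauss_kernel(t[:, None], b[None, :])  # G[j, i] = K(b_i, t_j), so (G @ a')_j = g(t_j)
eps = 1e-6                                # enforce g(t_j) >= eps > 0

obj = lambda ap: (ap - a) @ M @ (ap - a)  # ||f - g||_H^2 as a quadratic form
jac = lambda ap: 2 * M @ (ap - a)
cons = {"type": "ineq",                   # SLSQP convention: fun(x) >= 0
        "fun": lambda ap: G @ ap - eps,
        "jac": lambda ap: G}
res = minimize(obj, x0=a, jac=jac, constraints=[cons], method="SLSQP")
ap = res.x                                # coefficients a_i' of g
```

A dedicated QP solver would scale better than SLSQP as the grid gets finer, since the number of linear constraints grows with the number of grid points.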
Is there any efficient way to represent the constraint $g(\cdot)>0$ as a finite set of linear constraints on $a'$?