I am a newbie here, facing a hard convex optimization problem, and I sincerely hope someone can help me : ) There is a variable vector $\mathbf{x}$, and we wish to minimize the sum of $f$ over its components: $\min_{\mathbf{x}} \left(\sum_i f(x_i)\right),$ where $f$ is built from many logs and exponentials, so the minimizer $\mathbf{x}$ has no closed-form expression. Up to this point I know I could use gradient descent to get a numerical solution, but we further have one constraint, $\text{s.t. }\sum_i g(x_i) \le C,$ where $C$ is a constant and $g$ is likewise built from many logs and exponentials, so $\mathbf{x}$ still has no closed-form expression...
Note that $f$ and $g$ both have first and second derivatives, and we have already proved that $f$ and $g$ are convex. All we need is to solve numerically for the vector $\mathbf{x}$.
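To make the setup concrete, here is a toy version in Python/SciPy. The $f$, $g$, $C$, and $n$ below are placeholders I made up (my real $f$ and $g$ are messier), but the structure is exactly my problem:

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder f and g (hypothetical): my real functions are messier
# mixtures of log and exp, but like these they are smooth and convex.
def f(x):
    return np.log1p(np.exp(x))        # softplus, convex

def g(x):
    return np.exp(x) + np.exp(-x)     # convex

C = 20.0                              # right-hand side of the constraint
n = 5                                 # length of the vector x
x0 = np.zeros(n)                      # a feasible starting point here

res = minimize(
    fun=lambda x: np.sum(f(x)),                        # sum_i f(x_i)
    x0=x0,
    method="SLSQP",                                    # handles smooth nonlinear constraints
    constraints=[{"type": "ineq",                      # SLSQP convention: fun(x) >= 0
                  "fun": lambda x: C - np.sum(g(x))}], # encodes sum_i g(x_i) <= C
)
print(res.x, res.fun)
```

If SLSQP is the wrong tool for this, please say so; I also saw `method="trust-constr"` mentioned for constrained problems but am not sure which is appropriate.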
I tried some of the built-in commands/tools in Mathematica and Matlab, but failed; I also tried the Lagrange dual by hand, but in the end, because $\mathbf{x}$ has no closed form, I still could not solve the stationarity equations (a sketch of the direction I was attempting is below). I guess this is related to convex optimization without closed-form solutions, or gradient descent with constraints, but I am quite weak at math, so I sincerely hope someone can point me to any effective methods or algorithms ; ) (Or maybe I missed some better built-in tools in Mathematica or Matlab; feel free to point those out too.. ^^)
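To show what I meant by the Lagrange-dual attempt: my (possibly wrong) understanding is that for a fixed multiplier $\lambda \ge 0$ the Lagrangian separates across the $x_i$, each one-dimensional subproblem can be minimized numerically even without a closed form, and then one searches over $\lambda$ until the constraint is tight. A minimal sketch of that idea, with the same placeholder $f$ and $g$ as above (`brentq` and `minimize_scalar` are SciPy routines):

```python
import numpy as np
from scipy.optimize import brentq, minimize_scalar

# Same placeholder f and g as in the sketch above (hypothetical stand-ins).
f = lambda x: np.log1p(np.exp(x))     # softplus, convex
g = lambda x: np.exp(x) + np.exp(-x)  # convex
C, n = 20.0, 5

def x_of_lam(lam):
    # For fixed lam >= 0 the Lagrangian sum_i f(x_i) + lam*(sum_i g(x_i) - C)
    # separates into n identical 1-D convex problems: min_x f(x) + lam*g(x).
    sol = minimize_scalar(lambda x: f(x) + lam * g(x),
                          bounds=(-50.0, 50.0), method="bounded")
    return np.full(n, sol.x)  # all coordinates coincide in this toy problem

def slack(lam):
    # At the optimum, either lam = 0 or the constraint is active:
    # sum_i g(x_i(lam)) - C = 0 (complementary slackness).
    return np.sum(g(x_of_lam(lam))) - C

if slack(0.0) <= 0:
    x_star = x_of_lam(0.0)                # unconstrained minimizer already feasible
else:
    lam_star = brentq(slack, 0.0, 100.0)  # assumes the root lies in [0, 100]
    x_star = x_of_lam(lam_star)
print(x_star)
```

Is this bisection-on-$\lambda$ idea sound for my real problem, or is there a standard method I should be using instead?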
I really appreciate your help on this; maybe it is quite easy for you guys, but it could save my life =)
THANK YOU VERY MUCH!