
I have a "linear" set of equations, shown below, with a nonlinear optimization objective.

P(0) + P(1) = 1

P(0, 0) + P(0,1) = P(0)

P(0) < 1

P(1) < 1

P(0,0) > 0

P(0,1) > 0

P(1) > 0

The goal to optimize is

$P(0,1)\log{P(0,1)} + P(1)\log{P(1)} - (P(1)+P(0,1))\log{(P(1)+P(0,1))}$

I know nonlinear optimization can be hard in general, but is there a quick technique for solving this special kind of nonlinear optimization problem?

  • It looks like you did not give all the information needed. What is your optimization variable? Is it some kind of function $p$? Or is it just the values $P(0)$, $P(1)$, ...? Anyway: your objective function is convex and the constraints are linear, so a lot of theory applies here. 2012-02-05
  • Sorry for the confusion. $P(0)$ is a variable; here, 0 is a label for the variable, not an input to a function. 2012-02-05
  • And what are $P(0,0)$ and $P(0,1)$? 2012-02-06
  • $P(0,1)$ and $P(0,0)$ are optimization variables too. 2012-02-07

1 Answer


Write $a = P(0,1)$ and $b = P(1)$. Since $P(0,0) > 0$ forces $P(0,1) < P(0) = 1 - P(1)$, the constraints reduce to $a, b > 0$ and $a + b < 1$, and the problem can be written as:

$$J(a,b) = a\log a + b\log b - (a+b)\log (a+b)\quad \quad a, b>0,\ a+b< 1$$

Let $u=a+b$, $p = a / u$ be a reparametrization. Then the problem becomes:

$$J(u, p) = u\bigl(p \log p + (1-p)\log(1-p)\bigr)\quad\quad 0 < p < 1,\ 0 < u < 1$$
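Substituting $a = up$ and $b = u(1-p)$ confirms this identity, since the $u\log u$ terms cancel:

$$\begin{aligned}
J &= up\log(up) + u(1-p)\log\bigl(u(1-p)\bigr) - u\log u \\
  &= \bigl(p + (1-p) - 1\bigr)\,u\log u + u\bigl(p\log p + (1-p)\log(1-p)\bigr) \\
  &= u\bigl(p\log p + (1-p)\log(1-p)\bigr).
\end{aligned}$$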

Fixing $u$, $J(u, p)$ approaches its supremum $0$ as $p \rightarrow 0$ or $p\rightarrow 1$, and attains its minimum $-u$ at $p = 1/2$ (taking $\log$ base $2$; with the natural log the minimum is $-u\ln 2$). Since $0 < u < 1$, the infimum over the whole domain is $-1$, approached as $u \rightarrow 1$ with $p = 1/2$; it is not attained because the constraints are strict inequalities.
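As the comments note, the objective is convex and the constraints are linear, so an off-the-shelf solver handles it directly. A minimal numerical sketch (my own, not from the answer), using `scipy.optimize.minimize` with base-2 logs to match the $-u$ above, and a small margin `eps` to approximate the open constraints:

```python
import numpy as np
from scipy.optimize import minimize

def J(x):
    """Objective a*log2(a) + b*log2(b) - (a+b)*log2(a+b)."""
    a, b = x
    return a * np.log2(a) + b * np.log2(b) - (a + b) * np.log2(a + b)

# Open constraints 0 < a, 0 < b, a + b < 1, approximated with a margin eps.
eps = 1e-9
res = minimize(
    J,
    x0=[0.3, 0.3],
    bounds=[(eps, 1.0), (eps, 1.0)],
    constraints=[{"type": "ineq", "fun": lambda x: 1.0 - eps - x[0] - x[1]}],
)
print(res.x, res.fun)  # a and b near 0.5, objective near the infimum -1
```

The solver (SLSQP, since constraints are supplied) pushes $a + b$ against the boundary with $a = b$, consistent with the analysis: $p = 1/2$ and $u \rightarrow 1$.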