
Let $$f(x,y)=(-c)\cdot x+y\cdot (Ax-b),$$ where $x,c\in \mathbb{R}^n$, $y,b\in \mathbb{R}^m$ and $A\in \mathbb{R}^{m\times n}$. I want to prove that if $x^*$ is the solution to the linear programming problem $$\min\{c\cdot x\colon Ax=b, \ x\geq 0\}$$ and $y^*$ is the solution to the dual problem, then $(x^*,y^*)$ is a saddle point of the function $f$. Any ideas on how to approach this?

  • Have you even glanced at $f(x,y^*)$ or $f(x^*,y)$? (2017-01-19)
  • I know that $f(x,y^*)=(-c)\cdot x+y^*\cdot(Ax-b)$ and $f(x^*,y)=(-c)\cdot x^*$. (2017-01-19)
  • So the latter does not depend on $y$; that makes it easy! Since $y^*$ satisfies the dual constraints, you can simplify the expression for $f(x,y^*)$. (2017-01-19)
  • OK, we get $f(x,y^*)=(-b)\cdot y^*$. How does that help us exactly? (2017-01-19)
  • The point $(x^*,y^*)$ then satisfies the definition of a saddle point: there is no $x$ such that $f(x,y^*) < f(x^*,y^*)$ and no $y$ such that $f(x^*,y) > f(x^*,y^*)$. (And you need to use $+c$ instead of $-c$, as otherwise the inequalities should be reversed.) (2017-01-19)
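Following the hints in the comments, the argument can be written out in a few lines. This is a sketch under two standard assumptions not spelled out in the question: the dual of $\min\{c\cdot x : Ax=b,\ x\ge 0\}$ is $\max\{b\cdot y : A^\top y \le c\}$, and strong duality holds, i.e. $c\cdot x^* = b\cdot y^*$.

```latex
% Sketch for f(x,y) = -c.x + y.(Ax - b), assuming the dual is
% max{ b.y : A^T y <= c } and strong duality c.x* = b.y* holds.
\begin{align*}
  f(x^*, y) &= -c \cdot x^* + y \cdot (Ax^* - b) = -c \cdot x^*
    && \text{(primal feasibility: } Ax^* = b\text{)} \\
  f(x, y^*) &= (A^\top y^* - c) \cdot x - b \cdot y^* \le -b \cdot y^*
    && \text{(} A^\top y^* \le c \text{ and } x \ge 0\text{)} \\
  &\phantom{= (A^\top y^* - c) \cdot x }\,= -c \cdot x^*
    && \text{(strong duality)}
\end{align*}
% Hence f(x, y^*) <= f(x^*, y^*) = f(x^*, y) for all x >= 0 and all y.
```

With this $-c$ sign convention, $(x^*,y^*)$ is a maximum in $x$ and constant in $y$, which is exactly why the last comment suggests using $+c$ to obtain the usual min–max form of the saddle inequalities.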

1 Answer


Write $$ f(x,y) = -c \cdot x + y \cdot (Ax - b) = f(u), \qquad u = (x_1,\dotsc,x_n, y_1, \dotsc, y_m)^\top. $$ With the summation convention, the first partial derivatives are $$ \partial_k f = \partial_{x_k} f = y_i a_{ik} - c_k \quad (k \in \{1,\dotsc,n \}), \\ \partial_k f = \partial_{y_{k-n}} f = a_{(k-n)j} x_j - b_{k-n} \quad (k \in \{n+1,\dotsc,n+m \}), $$ or, in matrix form, $$ \DeclareMathOperator{\grad}{grad} \grad f = \begin{pmatrix} A^\top y -c \\ Ax-b \end{pmatrix}. $$ The Hessian $H_{ij} = \partial_i \partial_j f$ has entries $$ H_{ij} = 0 \quad (i,j \in \{1, \dotsc, n\}), \\ H_{ij} = 0 \quad (i,j \in \{n+1, \dotsc, n+m\}), \\ H_{ij} = a_{(j-n)i} \quad (i \in \{1, \dotsc, n\},\ j \in \{n+1, \dotsc, n+m\}), \\ H_{ij} = a_{(i-n)j} \quad (i \in \{n+1, \dotsc, n+m\},\ j \in \{1, \dotsc, n\}), $$ or, in block form, $$ H = \begin{pmatrix} 0 & A^\top \\ A & 0 \end{pmatrix}. $$ It seems we need to show that $H$ has both positive and negative eigenvalues, i.e. that it is indefinite, indicating a saddle point. I am not sure whether this is sufficient for Hessians of more than two variables. In any case, the associated quadratic form is

$$ u^\top H u = x^\top A^\top y + y^\top A x = y^\top A x + y^\top A x = 2 y^\top A x $$

  • This method is only useful for specific $A$. I don't think the second partial derivative test is the way to go here. (2017-01-19)
  • What was your idea? (2017-01-19)
  • Looking at $f(x,y^*)$ and $f(x^*,y)$ seems like the way to go, but I haven't figured out the full solution yet. (2017-01-19)
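For what it's worth, the suggested approach of comparing $f(x,y^*)$ and $f(x^*,y)$ checks out numerically. Below is a sketch for a tiny LP whose primal and dual solutions can be read off by hand; the data $A$, $b$, $c$ and the solutions $x^*$, $y^*$ are my own illustrative choices, not from the question. Sampling arbitrary $x \ge 0$ and arbitrary $y$ confirms $f(x,y^*) \le f(x^*,y^*) = f(x^*,y)$ under the question's $-c$ sign convention.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny example LP: min x1 + 2*x2  s.t.  x1 + x2 = 1,  x >= 0.
# By inspection: x* = (1, 0) with value 1; the dual max{ y : y <= 1, y <= 2 }
# gives y* = 1 with the same value (strong duality).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])
x_star = np.array([1.0, 0.0])
y_star = np.array([1.0])

def f(x, y):
    """f(x, y) = -c.x + y.(Ax - b), as in the question."""
    return -c @ x + y @ (A @ x - b)

f_saddle = f(x_star, y_star)

# f(x*, y) is constant in y (since Ax* = b), and f(x, y*) <= f(x*, y*)
# for every x >= 0 (x need not satisfy Ax = b).
for _ in range(1000):
    x = rng.uniform(0.0, 5.0, size=2)   # arbitrary x >= 0
    y = rng.uniform(-5.0, 5.0, size=1)  # arbitrary y
    assert f(x, y_star) <= f_saddle + 1e-12
    assert abs(f(x_star, y) - f_saddle) < 1e-12

print("saddle inequalities hold: f(x, y*) <= f(x*, y*) =", f_saddle)
```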