
Let $G=(V,E)$ be a graph with $|V|=n$ and $|E|=m$, and suppose we formulate the shortest path problem on $G$ as follows: minimize ${}^t(1,\dots,1)\,x$ subject to $Bx={}^t(1,-1,0,\dots,0)$ and $x\in \{0,1\}^m$, where $B \in M_{n\times m}(\mathbb Z)$ is the incidence matrix of $G$ (rows correspond to vertices, columns to edges).
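
For concreteness, here is a small numerical sketch of this formulation (my own toy instance, assuming a directed graph so that $B$ has a $+1$ at the tail and a $-1$ at the head of each edge), solving the relaxation with bounds $0 \le x \le 1$ via SciPy's `linprog`:

```python
# Toy instance (hypothetical graph, illustration only): solve the LP
# relaxation  min 1^T x  s.t.  Bx = e_1 - e_2,  0 <= x <= 1
# and check that the optimum is the 0/1 indicator of the shortest path.
import numpy as np
from scipy.optimize import linprog

# Directed edges on vertices 0..3; vertex 0 is the source, vertex 1 the sink.
edges = [(0, 2), (2, 1), (0, 3), (3, 2)]
n, m = 4, len(edges)

# Incidence matrix B: +1 at the tail and -1 at the head of each edge.
B = np.zeros((n, m))
for j, (u, v) in enumerate(edges):
    B[u, j] = 1.0
    B[v, j] = -1.0

b = np.zeros(n)
b[0], b[1] = 1.0, -1.0      # right-hand side e_1 - e_2

c = np.ones(m)              # every edge has cost 1

res = linprog(c, A_eq=B, b_eq=b, bounds=[(0, 1)] * m, method="highs")
print(res.fun)   # 2.0, the length of the unique shortest path 0 -> 2 -> 1
print(res.x)     # [1. 1. 0. 0.], already integral
```

On this instance the relaxed optimum is already the 0/1 indicator of the shortest path $0 \to 2 \to 1$.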

Here I want to show that the optimal value does not change if $x$ runs through $[0,1]^m$ instead of $\{0,1\}^m$. I have already shown that we may let $x$ run through $[0,\infty)^m$, i.e., the vertices of the polytope defined by the problem are integral. It remains to show that the optimum is attained at some $x$ with $x \le 1$ coordinatewise.

How can I show that?

1 Answer


This follows from flow decomposition. When $x\in [0,1]^m$ is feasible, it is a convex combination of (indicator vectors of) paths from vertex $1$ to vertex $2$, plus possibly a nonnegative combination of cycles (I can make this more constructive if you can't see it after some thinking). Dropping the cycles does not increase the objective, since all edge costs are $1$, and the weighted average length of the remaining paths is certainly no smaller than the length of the shortest path. This resolves the problem.
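
To make this a bit more concrete, here is a sketch of the decomposition (my own illustration with a made-up fractional point), assuming the support of $x$ is acyclic; any cycles can be peeled off first, which, since all costs are $1$, only lowers the objective:

```python
# Sketch of the flow decomposition behind the argument (illustration only).
# A feasible x in [0,1]^m with Bx = e_s - e_t is peeled into weighted s-t
# paths; 1^T x is then the weighted sum of the path lengths, hence at least
# the shortest-path length.  Assumes the support of x is acyclic.
def peel_paths(x, s, t, eps=1e-9):
    """Decompose an acyclic s-t flow x (dict: (u, v) -> value) into
    weighted paths.  Returns a list of (weight, [v0, v1, ..., vk])."""
    flow = {e: val for e, val in x.items() if val > eps}
    paths = []
    while any(tail == s for (tail, _) in flow):
        # Walk from s to t along edges that still carry flow.
        path = [s]
        while path[-1] != t:
            u = path[-1]
            v = next(head for (tail, head) in flow if tail == u)
            path.append(v)
        # Peel off as much flow as the bottleneck edge on this path allows.
        w = min(flow[(path[i], path[i + 1])] for i in range(len(path) - 1))
        for i in range(len(path) - 1):
            e = (path[i], path[i + 1])
            flow[e] -= w
            if flow[e] <= eps:
                del flow[e]
        paths.append((w, path))
    return paths

# A fractional feasible point: half a unit on 0 -> 2 -> 1, half on 0 -> 3 -> 1.
x = {(0, 2): 0.5, (2, 1): 0.5, (0, 3): 0.5, (3, 1): 0.5}
paths = peel_paths(x, s=0, t=1)
objective = sum(x.values())                        # 1^T x = 2.0
weighted = sum(w * (len(p) - 1) for w, p in paths) # weighted sum of lengths
print(paths)               # [(0.5, [0, 2, 1]), (0.5, [0, 3, 1])]
print(objective, weighted) # 2.0 2.0, both >= shortest-path length 2
```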