There's a proof in this paper by Stef Tijs that the approach Henning suggests, maximizing $x_1 + \epsilon x_2 + \epsilon^2 x_3 + \cdots + \epsilon^{n-1} x_n$ over $P$ for sufficiently small $\epsilon > 0$, will work. Unfortunately, the proof is nonconstructive, so it doesn't tell you exactly which value of $\epsilon$ to use. It just proves that there exists some $\hat{\epsilon} > 0$ such that, for all $0 < \epsilon \leq \hat{\epsilon}$, maximizing that objective over $P$ will give you the lexicographic maximum of $P$. I think Henning's suggestion to let $\epsilon$ be smaller than the ratio between any two nonzero entries of $A$ should work, though.
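To illustrate the single-LP approach, here is a small sketch using `scipy.optimize.linprog`. The polytope and the value $\epsilon = 10^{-3}$ are my own assumptions chosen for the example; they are not derived from the bound in Tijs's proof, so for other data you would need to pick $\epsilon$ appropriately.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical example polytope P: x1 <= 1, x1 + x2 <= 1.5, x >= 0.
# Its lexicographic maximum is (1, 0.5): first maximize x1, then x2.
A_ub = np.array([[1.0, 0.0],
                 [1.0, 1.0]])
b_ub = np.array([1.0, 1.5])

n = 2
eps = 1e-3  # assumed small enough for this data; not a general-purpose choice
# Maximize x1 + eps*x2 + ... by minimizing the negated weight vector.
c = -np.array([eps**k for k in range(n)])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * n, method="highs")
print(res.x)  # -> approximately [1.0, 0.5]
```

The perturbed objective breaks ties among the optimal vertices of the $x_1$-maximization in favor of larger $x_2$, which is exactly the lexicographic order when $\epsilon$ is small enough.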
Yuval's idea of finding the lexicographic maximum by iteratively finding the value of $x_1$, then $x_2$, and so forth should also work. The disadvantage there is that it might require solving up to $n$ separate linear programs, whereas Henning's suggestion, for small enough $\epsilon$, finds the lexicographic maximum with a single LP.
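The iterative approach can be sketched as follows, again with `scipy.optimize.linprog` and a small example polytope of my own choosing: maximize each coordinate in turn, pinning it to its optimum with an equality constraint before moving on.

```python
import numpy as np
from scipy.optimize import linprog

def lex_max(A_ub, b_ub, n):
    """Lexicographic maximum over {x >= 0 : A_ub @ x <= b_ub},
    solving up to n LPs: maximize x_i, then fix it and continue."""
    A_eq, b_eq = [], []
    x = np.zeros(n)
    for i in range(n):
        c = np.zeros(n)
        c[i] = -1.0  # linprog minimizes, so negate to maximize x_i
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      A_eq=np.array(A_eq) if A_eq else None,
                      b_eq=np.array(b_eq) if b_eq else None,
                      bounds=[(0, None)] * n, method="highs")
        x[i] = res.x[i]
        # Pin x_i to its optimum for the remaining subproblems.
        row = np.zeros(n)
        row[i] = 1.0
        A_eq.append(row)
        b_eq.append(x[i])
    return x

# Hypothetical polytope: x1 <= 1, x1 + x2 <= 1.5, x >= 0.
A = np.array([[1.0, 0.0], [1.0, 1.0]])
b = np.array([1.0, 1.5])
print(lex_max(A, b, 2))  # -> approximately [1.0, 0.5]
```

In practice one might instead substitute the fixed values into the constraints to shrink each subsequent LP, but the equality-constraint version keeps the sketch short.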