Suppose we have a basis for an integer lattice formed by the vectors $\vec v_1, \vec v_2, \ldots, \vec v_n$. Then let $A$ be the matrix whose columns are these vectors, $A = (\vec v_1 \mid \vec v_2 \mid \cdots \mid \vec v_n)$.
Here is my question: is there an algorithm which performs elementary column operations on $A$ (so that the columns remain a basis of the same lattice) such that $\max(\|\vec u_1\|_p, \|\vec u_2\|_p, \ldots, \|\vec u_n\|_p)$ is minimized, where the $\vec u_i$ are the new column vectors? The specific cases $p=1$, $p=2$, and $p=\infty$ are of particular interest to me.
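As a small example of the kind of improvement I mean: with columns $\vec v_1 = (1,0)^T$ and $\vec v_2 = (3,1)^T$, replacing $\vec v_2$ by $\vec v_2 - 3\vec v_1 = (0,1)^T$ drops $\max_i \|\vec u_i\|_\infty$ from $3$ to $1$.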
Here are my thoughts for a slow-as-molasses approach:
I could first apply the LLL algorithm to bring the basis vectors reasonably close to the origin. Once that is done, an $L_p$ ball could be drawn centered at the origin with radius equal to the length of the longest remaining vector. We could then brute-force the answer by checking every possible basis made up of lattice vectors inside this ball (a rough sketch of this is below).
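To make that concrete, here is a rough Python sketch of the brute force. The LLL step is omitted (in practice I would first reduce the basis with something like fpylll), and `minimize_max_norm`, the coefficient bound through $A^{-1}$, and the determinant test for staying in the same lattice are just illustrative choices on my part:

```python
# Rough sketch, assuming NumPy is available.  The initial LLL reduction is
# skipped; this only does the "enumerate the ball, then try every subset" part.
import itertools
import numpy as np

def minimize_max_norm(basis_columns, p=np.inf):
    """basis_columns: n x n integer matrix whose COLUMNS are the basis vectors."""
    A = np.array(basis_columns, dtype=int)
    n = A.shape[0]
    target_det = abs(round(np.linalg.det(A)))   # |det| is an invariant of the lattice
    R = max(np.linalg.norm(A[:, i], ord=p) for i in range(n))

    # Any lattice vector u = A c with ||u||_p <= R also has ||u||_inf <= R, and
    # c = A^{-1} u, so |c_i| <= (max absolute row sum of A^{-1}) * R.
    Ainv = np.linalg.inv(A)
    bound = int(np.floor(np.max(np.sum(np.abs(Ainv), axis=1)) * R + 1e-9))

    # Enumerate every nonzero lattice vector inside the closed L_p ball of radius R.
    candidates = []
    for c in itertools.product(range(-bound, bound + 1), repeat=n):
        if any(c):
            u = A @ np.array(c)
            if np.linalg.norm(u, ord=p) <= R + 1e-9:
                candidates.append(u)

    # Check every n-subset of candidates; it is a basis of the SAME lattice
    # exactly when its determinant has the same absolute value as det(A).
    best, best_val = A, R
    for subset in itertools.combinations(candidates, n):
        M = np.column_stack(subset)
        if abs(round(np.linalg.det(M))) != target_det:
            continue
        val = max(np.linalg.norm(M[:, i], ord=p) for i in range(n))
        if val < best_val:
            best, best_val = M, val
    return best, best_val

# Toy example: columns (1,0) and (3,1); the optimum drops the max inf-norm from 3 to 1.
print(minimize_max_norm([[1, 3],
                         [0, 1]], p=np.inf))
```

The candidate set always contains the original columns, so the result is never worse than the starting basis; the cost, of course, is exponential in both the enumeration and the subset search, which is why I call it slow-as-molasses.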
EDIT:
It looks like the $p=\infty$ case can be reduced to a simpler problem: minimizing the largest absolute value of any entry of the matrix. I have also found an article that looks promising in that it may have the answer for the $p=1$ case, but I'm having difficulty understanding some of the notation.
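Concretely, the reduction is just that
$$\max_i \|\vec u_i\|_\infty = \max_i \max_j |(\vec u_i)_j| = \max_{i,j} |(\vec u_i)_j|,$$
i.e. the objective for $p=\infty$ is exactly the largest absolute value appearing anywhere in $(\vec u_1 \mid \cdots \mid \vec u_n)$.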