I'm searching for an algorithm that efficiently does the following
Input:
- $n\in \mathbb{N}$
- $j_1,j_2,\ldots ,j_n$ with the property $j_1<j_2<\cdots<j_n$
- $b_1,\ldots ,b_n \in \mathbb{N}$
- $m\geq b_1+\cdots+b_n$
Output: an $n\times m$ matrix with the property described below.
To be more precise, I have a special case of an $n\times m$ $0$-$1$ matrix ($n$ is given; $m$ is bounded below by $b_1+b_2+\cdots+b_n$, but we can choose it arbitrarily large if we like), in which row $k$ contains exactly $b_k$ ones and no column contains more than one $1$ (this is why the lower bound on $m$ guarantees that such a matrix exists).
We are also given a series of coordinates $j_1<\cdots<j_n$, one for each row.
Among all the possible matrices, we're interested in finding one (one certainly exists) that has the following property:
$\max\{\,|j_k-\mathrm{positionOf}(1)| : k= 1,\ldots,n,\ \text{over all } 1\text{s appearing on row } k\,\}$ is the lowest among all the possible matrices. Less formally, I want to find one of the matrices that minimizes the biggest distance between $j_k$ and any $1$ on row $k$, over all $k$.
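To make the objective concrete, here is a small evaluator (my own notation, not from the problem: `placement[k]` lists the columns holding a $1$ on row $k$):

```python
def objective(placement, j):
    """Largest distance |j[k] - c| over every row k and every
    column c that holds a 1 on row k (smaller is better)."""
    return max(abs(jk - c) for jk, cols in zip(j, placement) for c in cols)

# Example: j = [2, 5]; row 0 has ones in columns 1 and 2,
# row 1 has a single one in column 5.
print(objective([[1, 2], [5]], [2, 5]))  # prints 1
```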
Is there an algorithm (preferably of linear complexity) to find such a matrix? My best effort so far solves it in polynomial time, but it's really a brute-force solution and not very practical.
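For reference, here is a sketch of one plausible direction, not a verified solution: under the reading that each row $k$ gets $b_k$ ones in pairwise distinct columns, one can binary-search the answer $d$ and greedily test whether every row can fit its ones within distance $d$ of $j_k$. The greedy test leans on the $j_k$ being sorted, so the candidate windows $[j_k-d,\ j_k+d]$ move left to right; all names below are my own.

```python
def feasible(d, j, b, m):
    """Greedy check: can every row k place b[k] ones in columns
    within distance d of j[k], using each column (1..m) at most once?
    Assumes j is strictly increasing. Columns skipped by one row lie
    to the left of every later window, so tracking only the smallest
    unused column suffices."""
    next_free = 1  # smallest column not yet claimed
    for jk, bk in zip(j, b):
        lo = max(next_free, jk - d)
        hi = min(m, jk + d)
        if hi - lo + 1 < bk:   # window too small for this row
            return False
        next_free = lo + bk    # claim columns lo .. lo + bk - 1
    return True

def min_max_distance(j, b, m):
    """Smallest d for which a placement exists (binary search on d)."""
    lo, hi = 0, m
    while lo < hi:
        mid = (lo + hi) // 2
        if feasible(mid, j, b, m):
            hi = mid
        else:
            lo = mid + 1
    return lo
```

If the greedy test is correct for this problem, the whole thing runs in $O(n \log m)$ rather than linear time, but it may still be a useful starting point.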
Thank you in advance.