
Consider a set $S = \{s_1, \ldots, s_N\}$, where all $s_i$ are positive integers selected from some interval $[M, L]$, and where the sum of any $k$ integers in $S$ is required to be unique and to have a distance of at least $d$ from every other possible sum of $k$ integers in the set. For example, if $k = 2$, the sum $s_a + s_b$ of any two elements must be unique and must differ by at least $d$ from every other possible sum of two elements of $S$.

As a function of $k$, the minimum difference $d$ between sums, and the size of the interval $(L - M)$, what is the largest possible size of the set $S$, and what is the most efficient way to compute its elements?
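For concreteness, here is a naive greedy sketch that enforces the constraint directly: it scans candidates in $[M, L]$ and keeps an element only if every $k$-element sum involving it stays at least $d$ away from all other $k$-element sums. The parameter names match the question; the greedy strategy itself is only an illustrative baseline, with no claim of optimality.

```python
from itertools import combinations

def greedy_set(M, L, k, d):
    """Greedily build S from [M, L] so that all sums of k distinct
    elements are pairwise at distance >= d. Illustrative sketch only."""
    S = []
    sums = set()  # all sums of k distinct elements of S found so far
    for candidate in range(M, L + 1):
        if len(S) < k - 1:
            # Not enough elements yet to form a k-sum; accept unconditionally.
            S.append(candidate)
            continue
        # k-sums that adding `candidate` would create
        subsets = list(combinations(S, k - 1))
        new_sums = {sum(c) + candidate for c in subsets}
        distinct = len(new_sums) == len(subsets)
        far_from_old = all(abs(ns - s) >= d for ns in new_sums for s in sums)
        far_from_each_other = all(
            abs(a - b) >= d for a, b in combinations(sorted(new_sums), 2)
        )
        if distinct and far_from_old and far_from_each_other:
            S.append(candidate)
            sums |= new_sums
    return S

# Example: for k = 2, d = 1 this greedily builds a Sidon-like set.
print(greedy_set(1, 100, 2, 1))
```

This brute-force check costs $\binom{|S|}{k-1} \cdot |\text{sums}|$ work per candidate, so it is only meant to make the constraint precise, not to answer the efficiency question.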

Edit - I'm of course happy to hear any $d = 1$ solutions.

Edit 2 - For the $d=1$ case, is this problem NP-complete, similar to the subset-sum problem?

2 Answers