I also assume that each of the $m_k$ is a positive integer, and that we are testing whether the series either 'must' diverge or 'must' converge.
The intuition behind this question seems a bit fuzzy to me, as I don't like thinking of the number of partitions of 'infinity.' Here are a couple of the strange things that struck me: we care about the rate at which $j$ and $n$ go to infinity. For example, if we consider the problem as $j = n \to \infty$, then we hit a point where 'most' of the $m_k$ must be identically 1, and so their reciprocal series diverges.
Of course, we can't let $j$ go to infinity faster than $n$, as we require $m_k > 0$. So all that remains are the cases when $n$ increases faster than $j$. Does it matter how much faster? Absolutely! Suppose that $n = O(e^j)$ and we let $j \to \infty$. Then $n$ is so much larger than $j$ that the average size of the $m_k$ would need to be very large. But now we have a conundrum: no matter how much larger $n$ is than $j$, we could just make all the $m_k$ equal to 1 except for one, which holds the remainder. That is, there are $j - 1$ degrees of freedom in our sum (so long as they don't add up to more than $n - 1$, of course). If we let every $m_k$ except $m_0$, say, equal 1, then the reciprocal series diverges as $n$ gets large. But if we choose the $m_k$ to all be about the average size, i.e. somewhere around $\dfrac{e^j}{j}$, then their reciprocal series converges.
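A quick numerical sketch of this conundrum (the sizes $j = 20$, $n = \lfloor e^j \rceil$ are hypothetical choices, just for illustration): the "all 1s except one" partition has a large reciprocal sum, while the balanced partition's is tiny.

```python
import math

# Partition n into j positive parts in two ways and compare reciprocal sums.
# Illustrative sizes: j = 20, n ~ e^j.
j = 20
n = round(math.exp(j))

# Extreme partition: j - 1 parts equal to 1, one part holding the remainder.
extreme = [1] * (j - 1) + [n - (j - 1)]

# Balanced partition: every part within 1 of the average n/j.
q, r = divmod(n, j)
balanced = [q + 1] * r + [q] * (j - r)

assert sum(extreme) == n and sum(balanced) == n

recip_extreme = sum(1 / m for m in extreme)    # roughly j - 1: large
recip_balanced = sum(1 / m for m in balanced)  # roughly j^2 / e^j: tiny

print(recip_extreme, recip_balanced)
```

The extreme partition's reciprocal sum grows like $j - 1$ as $j$ increases, while the balanced one's shrinks like $j^2/e^j$, matching the dichotomy above.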
So perhaps the interesting question is instead: how much faster does $n$ need to grow in comparison to $j$ to permit a convergent reciprocal series? Here, we have a sort of connection to the harmonic series. Consider the case when $n = O(j^2)$ as $j \to \infty$. Then we could choose all $m_k$ to be about the average value, which is about $\frac{j^2}{j} = j$, and we end up summing $j$ of them. Our reciprocal sum would then look like $\sum_{k=0}^j \frac{1}{j} \approx 1$, which is interesting. If $n = O(j^{1.99})$, the average looks to be something like $j^{0.99}$, the reciprocal sum looks like $\sum^j \frac{1}{j^{0.99}} = j^{0.01}$, and so as $j \to \infty$, the reciprocal sum diverges. Conversely, if $n = O(j^{2 + \epsilon})$ for some $\epsilon > 0$, the reciprocal sum behaves like $j^{-\epsilon}$, and so it converges.
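The $j^2$ threshold above can be checked numerically for balanced partitions (the function and the sample exponents below are my own illustrative choices): with $n = j^\alpha$, the reciprocal sum scales like $j^{2-\alpha}$, so it is flat at $\alpha = 2$, grows for $\alpha < 2$, and shrinks for $\alpha > 2$.

```python
def recip_sum(j, alpha):
    """Reciprocal sum of a balanced partition of n = round(j^alpha) into j parts."""
    n = round(j ** alpha)
    q, r = divmod(n, j)
    parts = [q + 1] * r + [q] * (j - r)  # each part within 1 of the average n/j
    return sum(1 / m for m in parts)

# At alpha = 2 the sum hovers near 1; below 2 it grows with j; above 2 it decays.
for alpha in (1.9, 2.0, 2.1):
    print(alpha, [round(recip_sum(j, alpha), 4) for j in (10**2, 10**3, 10**4)])
```

Running this shows the three regimes side by side, mirroring the harmonic-series-style estimate in the paragraph above.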
I think this is interesting, and that's why I responded. Ultimately, I don't know if this is the flavor of response you were looking for. -David