One has
$$ \mathrm P(Y_i=X_n) = \int_{0}^{\infty} {n-1 \choose i-1} (1-\mathrm e^{-x})^{i-1} \mathrm e^{-(n-i)x} \frac1\theta \mathrm e^{-x/\theta} \, \mathrm dx. $$
The change of variable $\mathrm e^{-x}=s$ yields
$$ \mathrm P(Y_i=X_n)= \frac1\theta{n-1 \choose i-1}\int_{0}^{1} (1-s)^{i-1} s^{n-i} s^{1/\theta} \, s^{-1}\,\mathrm ds, $$
that is,
$$ \mathrm P(Y_i=X_n)= \frac1\theta{n-1 \choose i-1}\mathrm{B}(i,n-i+1/\theta)=\frac1\theta\,\frac{\Gamma(n)}{\Gamma(n+1/\theta)}\,\frac{\Gamma(n-i+1/\theta)}{\Gamma(n-i+1)}. $$
Sanity checks:
(i) When $\theta=1$, the distribution is uniform on $\{1,2,\ldots,n\}$.
(ii) When $\theta\to0$, the distribution concentrates on $i=1$.
(iii) When $\theta\to+\infty$, the distribution concentrates on $i=n$.
(iv) (More involved) For every $n\geqslant1$ and $a\gt0$, $ \sum_{k=0}^{n-1}\frac{\Gamma(k+a)}{\Gamma(k+1)}=\frac{\Gamma(n+a)}{a\Gamma(n)}. $ This follows by telescoping, since $ \frac{\Gamma(k+a)}{\Gamma(k+1)}=\frac1a\left(\frac{\Gamma(k+1+a)}{\Gamma(k+1)}-\frac{\Gamma(k+a)}{\Gamma(k)}\right) $ and the boundary term $\Gamma(a)/\Gamma(0)$ vanishes. Using this identity for $a=1/\theta$, with $k=n-i$, one sees that the sum of $\mathrm P(Y_i=X_n)$ over $i$ is $1$... as it should.
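For readers who want a numerical cross-check, here is a small sketch in Python: it evaluates the closed form via log-gammas (for stability), verifies checks (i) and (iv), and compares against a Monte Carlo simulation of the underlying setup, with $X_n$ exponential with mean $\theta$ and the other $n-1$ variables standard exponential. The values of `n`, `theta`, and the trial count are arbitrary choices for illustration.

```python
import math
import random

def p_rank(n, i, theta):
    """Closed form for P(Y_i = X_n): the probability that X_n, an
    exponential variable with mean theta, has rank i among itself and
    n-1 i.i.d. standard exponentials."""
    a = 1.0 / theta
    return a * math.exp(
        math.lgamma(n) - math.lgamma(n + a)
        + math.lgamma(n - i + a) - math.lgamma(n - i + 1)
    )

n, theta = 6, 2.5

# Check (iv): the probabilities over i = 1..n sum to 1.
probs = [p_rank(n, i, theta) for i in range(1, n + 1)]
assert abs(sum(probs) - 1.0) < 1e-12

# Check (i): for theta = 1 the distribution is uniform on {1, ..., n}.
assert all(abs(p_rank(n, i, 1.0) - 1.0 / n) < 1e-12 for i in range(1, n + 1))

# Monte Carlo: draw X_1, ..., X_{n-1} ~ Exp(1) and X_n with mean theta,
# record the rank of X_n, and compare empirical frequencies to p_rank.
random.seed(0)
trials = 200_000
counts = [0] * n
for _ in range(trials):
    x_n = random.expovariate(1.0 / theta)          # mean theta
    others = [random.expovariate(1.0) for _ in range(n - 1)]
    rank = 1 + sum(1 for x in others if x < x_n)   # rank i of X_n
    counts[rank - 1] += 1

for i in range(1, n + 1):
    assert abs(counts[i - 1] / trials - probs[i - 1]) < 0.01
```

With $\theta=2.5$ the distribution is visibly skewed toward large ranks, consistent with check (iii).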