Let $n \in \mathbb{N}$, $n \ge 3$, $f \in \mathbb{N}$, $f \ge 2$ ($f$ can depend on $n$, but $f \le n$).
How can I use a Taylor series to expand $\left(1 - \frac{1}{n}\right)^{f \cdot i}$ in the sum $$\sum_{i=1}^{n} \left(1 - \frac{1}{n} \right)^{f \cdot i} \mathbb{P}(X = i)$$ where $X$ is a random variable whose exact distribution is not important here?
I wanted to use $(1 - x)^\alpha = 1 - \alpha x + O(x^2)$, but this requires $\alpha$ to be constant (or at least $\alpha x$ to stay small), whereas in my case the exponent $f \cdot i$ can grow with $n$; for instance, $$\left( 1 - \frac{1}{n}\right)^{n^2} = e^{-n-\frac{1}{2} - \frac{1}{3n} + O(1/n^2)}.$$
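As a quick sanity check on that last identity (illustrative only; the value $n = 1000$ is arbitrary), one can compare logarithms, since $(1-1/n)^{n^2} = \exp\!\big(n^2 \ln(1-1/n)\big)$ and the quoted exponent is the start of the series $n^2\ln(1-1/n) = -n - \tfrac12 - \tfrac{1}{3n} - O(1/n^2)$:

```python
import math

n = 1000
# exact exponent, via (1 - 1/n)^(n^2) = exp(n^2 * log(1 - 1/n));
# log1p avoids cancellation when computing log(1 - 1/n)
exact_log = n**2 * math.log1p(-1.0 / n)
# first three terms of the expansion quoted above
approx_log = -n - 0.5 - 1.0 / (3 * n)
# the gap should be O(1/n^2): the next series term is -1/(4 n^2)
gap = exact_log - approx_log
```

For $n = 1000$ the gap is on the order of $10^{-7}$, consistent with the $O(1/n^2)$ remainder. (Working with logarithms also avoids underflow, since $e^{-1000}$ is far below double precision.)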
Could I use something like $$\sum_{i=1}^{n} \left(1 - \frac{1}{n} \right)^{f \cdot i} \mathbb{P}(X = i) \le \sum_{i=1}^n \left(1 - \frac{f \cdot i}{n} + O(1/n^2)\right) \mathbb{P}(X = i)?$$
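To see concretely why the linear bound is delicate when $f \cdot i$ is comparable to $n$, here is a small numeric comparison of $(1 - 1/n)^{f i}$ against $1 - \frac{f \cdot i}{n}$ (the values $n = 100$, $f = 2$ are purely illustrative):

```python
n, f = 100, 2  # illustrative values; f <= n as in the problem statement

def exact(i):
    """The true summand factor (1 - 1/n)^(f*i)."""
    return (1 - 1 / n) ** (f * i)

def linear(i):
    """The proposed first-order approximation 1 - f*i/n."""
    return 1 - f * i / n

# modest f*i: the linear term undershoots by roughly (f*i/n)^2 / 2
small_gap = exact(10) - linear(10)   # f*i = 20, gap about 0.018

# once f*i exceeds n, the linear term is negative while the exact factor is not
neg = linear(60)   # 1 - 120/100 = -0.2
pos = exact(60)    # 0.99**120, still positive
```

So the error term in the proposed inequality is really $O\big((f i / n)^2\big)$ rather than a uniform $O(1/n^2)$, which is exactly the issue raised above.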