I'm taking a graduate course in probability and statistics using Larsen and Marx, 4th edition, and we're looking specifically at estimation methods this week. I ran into a homework problem related to moment-generating functions, and I can't quite connect the dots on how the solution arrives at its answer.
Suppose you have three independent random variables $Y_{1}, Y_{2}, Y_{3}$ and you would like to determine the moment-generating function of $W = Y_{1} + Y_{2} + Y_{3}$, knowing that each of the three has the same pdf $f_{Y}(y) = \lambda^{2} y e^{-\lambda y}$, $y \geq 0$.
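If I'm reading the density right, this is a gamma pdf with shape $2$ and rate $\lambda$, since it integrates to $1$:
$$\int_{0}^{\infty} \lambda^{2} y e^{-\lambda y}\,dy = \lambda^{2} \cdot \frac{1}{\lambda^{2}} = 1.$$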
The easy part of this problem is applying the theorem that says, for $W = Y_{1} + Y_{2} + Y_{3}$ with the $Y_{i}$ independent, the moment-generating function of the sum is the product of the individual ones: $M_{W}(t) = M_{Y_{1}}(t)\, M_{Y_{2}}(t)\, M_{Y_{3}}(t)$.
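Just to make sure I have the right theorem: my understanding is that the product comes straight from independence, and since the $Y_{i}$ are identically distributed the three factors are equal,
$$M_{W}(t) = E\!\left[e^{t(Y_{1}+Y_{2}+Y_{3})}\right] = E\!\left[e^{tY_{1}}\right]E\!\left[e^{tY_{2}}\right]E\!\left[e^{tY_{3}}\right] = \left[M_{Y_{1}}(t)\right]^{3}.$$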
Where I run into trouble is getting the individual moment-generating functions for the $Y$'s. The problem directs you to apply yet another theorem: if you let another random variable $V$ equal $aY_{1}+b$, then it follows that $M_{V}(t) = e^{bt}M_{Y_{1}}(at)$.
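(As I understand it, that theorem is just substitution inside the expectation,
$$M_{aY_{1}+b}(t) = E\!\left[e^{t(aY_{1}+b)}\right] = e^{bt}\,E\!\left[e^{(at)Y_{1}}\right] = e^{bt}M_{Y_{1}}(at),$$
so no issue there.)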
The solution states that if you let $V = \lambda Y_{1}$ (equivalently, $Y_{1} = (1/\lambda)V$), then the pdf of $V$ becomes $f_{V}(v) = ve^{-v}$, $v \geq 0$, and subsequently you can get its moment-generating function with a simple integration by parts. But I can't quite follow how the theorem above is being applied to get the pdf of $V$.
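Here is what I pieced together on my own, using the ordinary change-of-variables formula rather than the mgf theorem, so I suspect I'm missing the intended argument. For $V = \lambda Y_{1}$,
$$f_{V}(v) = f_{Y}(v/\lambda)\cdot\frac{1}{\lambda} = \lambda^{2} \cdot \frac{v}{\lambda}\, e^{-v} \cdot \frac{1}{\lambda} = v e^{-v}, \qquad v \geq 0,$$
and then integration by parts gives
$$M_{V}(t) = \int_{0}^{\infty} e^{tv}\, v e^{-v}\,dv = \int_{0}^{\infty} v e^{-(1-t)v}\,dv = \frac{1}{(1-t)^{2}}, \qquad t < 1,$$
after which the theorem with $a = 1/\lambda$, $b = 0$ would give $M_{Y_{1}}(t) = M_{V}(t/\lambda)$. What I don't see is how the book's theorem, which relates mgfs, is supposed to produce the pdf of $V$ in the first place.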
Any insight? It's likely a fundamental property I missed along the way... In case the issue is actually somewhere else in my setup, I've also included a quick numerical sanity check below.
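For completeness, here's the simulation I used to convince myself the product theorem applies (not part of the assignment; the rate $\lambda = 2$ and $t = 0.5$ are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, n = 2.0, 0.5, 10**6   # arbitrary rate, any t < lam, sample size

# Each Y_i has the gamma pdf above with shape 2 and rate lam;
# numpy's gamma is parametrized by scale = 1/rate.
Y = rng.gamma(shape=2.0, scale=1.0 / lam, size=(n, 3))
W = Y.sum(axis=1)

mgf_W = np.mean(np.exp(t * W))                      # empirical M_W(t)
mgf_prod = np.prod(np.mean(np.exp(t * Y), axis=0))  # product of empirical M_{Y_i}(t)
print(mgf_W, mgf_prod)  # agree up to Monte Carlo error
```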