Let random variables $X_1, X_2, \dots, X_n$ be the lifetimes of the first blade, the second, and so on. Here $n=36$, and each $X_i$ has mean $40$ and standard deviation $8$.
The $X_i$ are independent, and therefore the variance of their sum is the sum of their variances. Thus the sum $X_1+\cdots+X_n$ of the lifetimes has variance $8^2+8^2+\cdots+8^2$ ($n$ terms), which is $(n)(8^2)$. The standard deviation of the sum is therefore $(\sqrt{n})(8)$.
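If you like to see such identities confirmed numerically, here is a minimal Monte Carlo sketch. The blades' distribution is not specified in the problem, so a normal distribution with mean $40$ and standard deviation $8$ is assumed here purely for illustration; the variance identity holds for any independent lifetimes.

```python
import random
import statistics

# Sketch: the distribution is an assumption (Normal(40, 8)); only the
# mean and standard deviation are given in the problem.
n, trials = 36, 50_000
sums = [sum(random.gauss(40, 8) for _ in range(n)) for _ in range(trials)]
print(statistics.variance(sums))  # close to (36)(8^2) = 2304
print(statistics.stdev(sums))     # close to (sqrt(36))(8) = 48
```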
We also use an informal version of the Central Limit Theorem: the sum of $n$ independent, identically distributed "nice" random variables is "nearly normal" if $n$ is large enough.
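To see this informal statement in action, here is a quick simulation sketch: sums of $36$ iid uniform variables (an arbitrary "nice" choice, not the problem's distribution) already have tails close to the normal approximation.

```python
import math
import random

# Sketch: sums of n = 36 iid Uniform(0, 1) variables compared against
# the normal approximation; Uniform(0, 1) is an arbitrary illustration.
n, trials = 36, 100_000
mu, var = n * 0.5, n * (1 / 12)  # mean and variance of the sum
sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]
cut = mu + 2 * math.sqrt(var)    # two standard deviations above the mean
empirical = sum(s > cut for s in sums) / trials
normal_tail = 0.5 * math.erfc(2 / math.sqrt(2))  # P(Z > 2), about 0.0228
print(empirical, normal_tail)    # the two numbers should nearly agree
```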
The numerical questions: (a) The sample average exceeds $60$ precisely if the sample sum exceeds $(36)(60)=2160$. So in a sense problems (a) and (b) are very much alike.
But there is a slightly different approach to (a) which is probably a bit better. Let $S$ be the sample sum discussed above. Then the sample average $Y$ is equal to $\dfrac{S}{n}$. The random variable $Y$ is also nearly normal. It has mean $40$. The variance of $\dfrac{S}{n}$ is $\dfrac{1}{n^2}$ times the variance of $S$ calculated above. It follows that $Y$ has variance $\dfrac{(n)(8^2)}{n^2}$, that is, $\dfrac{8^2}{n}$. Since $n=36$, $Y$ has standard deviation $\dfrac{8}{6}$.
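As a numeric sanity check of the $\dfrac{8}{6}$ figure, here is a small simulation sketch, again assuming Normal(40, 8) lifetimes purely for illustration:

```python
import random
import statistics

# Sketch: the sample mean of n = 36 assumed Normal(40, 8) lifetimes
# should have standard deviation 8/6.
n, trials = 36, 50_000
means = [statistics.fmean(random.gauss(40, 8) for _ in range(n))
         for _ in range(trials)]
print(statistics.stdev(means))  # close to 8/6 = 1.333...
```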
So we want the probability that a normal random variable with mean $40$ and standard deviation $\dfrac{8}{6}$ exceeds $60$. Since $60$ is $\dfrac{60-40}{8/6}=15$ standard deviation units above $40$, a ridiculous distance, the probability is essentially $0$.
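If you want to see just how small, here is a one-line check (a sketch; the normal tail at $15$ standard deviations is on the order of $10^{-51}$):

```python
import math

# Sketch of the arithmetic behind "essentially 0" for part (a).
z = (60 - 40) / (8 / 6)                # 15 standard deviations
p = 0.5 * math.erfc(z / math.sqrt(2))  # upper-tail P(Z > z)
print(z, p)                            # z = 15.0, p on the order of 1e-51
```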
(b) The mean of the sum is $(36)(40)=1440$, and its standard deviation is $(\sqrt{36})(8)=48$. We want the probability that a nearly normal random variable with mean $1440$ and standard deviation $48$ is less than $1250$. This is a standard calculation: the $z$-score $\dfrac{1250-1440}{48}\approx -3.96$ is nearly four standard deviations below the mean. Pretty unlikely!
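For completeness, here is that standard calculation carried out numerically, a minimal sketch using only the normal approximation above:

```python
import math

# Sketch: P(S < 1250) where S is nearly N(1440, 48) for part (b).
z = (1250 - 1440) / 48                      # about -3.96
p = 0.5 * (1 + math.erf(z / math.sqrt(2)))  # Phi(z), the normal CDF
print(z, p)                                 # z ~ -3.96, p roughly 4e-5
```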
Remark: You may be asking why the variance of a sum of independent random variables is equal to the sum of the variances. Here is a proof for a sum of two random variables $X$ and $Y$. The proof readily extends to longer sums.
Let $X$ and $Y$ be independent, with variances $\sigma^2_X$ and $\sigma^2_Y$. Recall that $\text{Var}(W)=E(W^2)-(E(W))^2$. Apply this with $W=X+Y$. We have
$$E((X+Y)^2)=E(X^2+2XY+Y^2)=E(X^2)+2E(XY)+E(Y^2)=E(X^2)+2E(X)E(Y)+E(Y^2).\tag{1}$$
(For the fact that $E(XY)=E(X)E(Y)$ we used independence.) Also,
$$(E(X+Y))^2=(E(X)+E(Y))^2=(E(X))^2+2E(X)E(Y)+(E(Y))^2.\tag{2}$$
Subtracting $(2)$ from $(1)$, the cross terms $2E(X)E(Y)$ cancel, and we find that
$$\text{Var}(X+Y)=E(X^2)-(E(X))^2 +E(Y^2)-(E(Y))^2.$$
But the right-hand side is just $\sigma^2_X+\sigma^2_Y$.
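And for the skeptical reader, a quick numeric confirmation of the identity, a sketch with two arbitrarily chosen independent distributions:

```python
import random
import statistics

# Sketch: check Var(X + Y) = Var(X) + Var(Y) for independent X, Y.
# The two distributions below are arbitrary choices for illustration.
trials = 200_000
xs = [random.expovariate(0.5) for _ in range(trials)]  # variance 1/0.5^2 = 4
ys = [random.uniform(0, 12) for _ in range(trials)]    # variance 12^2/12 = 12
print(statistics.variance(x + y for x, y in zip(xs, ys)))  # close to 4 + 12 = 16
```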