Suppose that $X_1,\ldots,X_M$ are independent exponential random variables with mean $c$, so that their pdf and cdf are given by $f(x)=c^{-1}e^{-x/c}$ and $F(x)=1-e^{-x/c}$, $x \geq 0$, respectively. Let $Y_M=\max \{X_1,\ldots,X_M\}$. Then $Y_M$ has cdf $F_M (x) = F(x)^M$ and hence pdf $f_M (x) = M F(x)^{M-1} f(x)$. Thus, the expectation of $Y_M$ is given by
$$
{\rm E}[Y_M ] = \int_0^\infty {x M F(x)^{M - 1} f(x)\,dx} = \int_0^\infty {x\frac{M}{c}e^{ - x/c} (1 - e^{ - x/c} )^{M-1}\,dx}.
$$
So you want to know why
$$
{\rm E}[Y_M] = c\sum\limits_{k = 1}^M {\frac{1}{k}}.
$$
Now, $Y_M$ is equal in distribution to $E_1 + \cdots + E_M$, where the $E_k$ are independent exponentials and $E_k$ has mean $c/k$; for this fact, see James Martin's answer to this MathOverflow question (where $c=1$). Taking expectations on both sides gives ${\rm E}[Y_M] = \sum\nolimits_{k=1}^M c/k$, and the result is established.
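As a quick numerical sanity check (a sketch in Python with hypothetical helper names, using plain trapezoidal quadrature rather than anything specific to the argument above), one can integrate $x f_M(x)$ numerically and compare against $c H_M$:

```python
import math

def harmonic(M):
    """H_M = 1 + 1/2 + ... + 1/M."""
    return sum(1.0 / k for k in range(1, M + 1))

def expected_max_exponential(M, c, upper=None, n=200_000):
    """Trapezoidal approximation of
    E[Y_M] = ∫_0^∞ x * (M/c) e^{-x/c} (1 - e^{-x/c})^{M-1} dx.
    The integrand decays like e^{-x/c}, so truncating at 40c is safe."""
    if upper is None:
        upper = 40.0 * c
    h = upper / n

    def f(x):
        return x * (M / c) * math.exp(-x / c) * (1.0 - math.exp(-x / c)) ** (M - 1)

    s = 0.5 * (f(0.0) + f(upper))
    for i in range(1, n):
        s += f(i * h)
    return s * h
```

For example, with $M = 5$ and $c = 2$ the quadrature agrees with $c H_5 = 2 \cdot \frac{137}{60}$ to several decimal places.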
EDIT: As an additional reference, see Example 4.22 on p. 157 in the book Probability, stochastic processes, and queueing theory by Randolph Nelson.
EDIT: It is interesting to note that
$$
\int_0^\infty {x\frac{M}{c}e^{ - x/c} (1 - e^{ - x/c} )^{M - 1}\,dx} = c\int_0^1 { - \log (1 - x^{1/M} )\,dx}.
$$
This follows by the change of variable $x \mapsto x/c$ followed by $x \mapsto (1-e^{-x})^M$. So, this gives the following integral representation for the $M$-th harmonic number $H_M := \sum\nolimits_{k = 1}^M {\frac{1}{k}}$:
$$
H_M = \int_0^1 { - \log (1 - x^{1/M} )\,dx}.
$$
Finally, it is both interesting and useful to note that
$$
H_M = \int_0^1 {\frac{{1 - x^M }}{{1 - x}}\,dx} = \sum\limits_{k = 1}^M {( - 1)^{k - 1} \frac{1}{k}{M \choose k}},
$$
see Harmonic number. With the above notation, the middle integral corresponds, via the substitution $x \mapsto 1 - e^{-x/c}$ (up to the factor $c$), to
$$
{\rm E}[Y_M] = \int_0^\infty {{\rm P}(Y_M > x)\,dx} = \int_0^\infty {[1 - F_M (x)]\,dx}.
$$
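The two closed forms for $H_M$ in the last display can be verified exactly with rational arithmetic (a sketch in Python with hypothetical helper names; the integral $\int_0^1 (1-x^M)/(1-x)\,dx$ is evaluated term by term via $(1-x^M)/(1-x) = \sum_{j=0}^{M-1} x^j$):

```python
from fractions import Fraction
from math import comb

def harmonic(M):
    """H_M = sum_{k=1}^M 1/k, computed exactly as a Fraction."""
    return sum(Fraction(1, k) for k in range(1, M + 1))

def alternating_binomial_sum(M):
    """sum_{k=1}^M (-1)^{k-1} (1/k) C(M,k)."""
    return sum(Fraction((-1) ** (k - 1) * comb(M, k), k) for k in range(1, M + 1))

def geometric_integral(M):
    """∫_0^1 (1-x^M)/(1-x) dx, using the finite geometric series
    (1-x^M)/(1-x) = sum_{j=0}^{M-1} x^j and integrating term by term."""
    return sum(Fraction(1, j + 1) for j in range(M))
```

All three agree for every $M$, which is exactly the chain of equalities displayed above.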