Let $X$ be a random variable with unknown parameter $\lambda$ and the pdf $f(t)=2\lambda t\cdot\mathrm e^{-\lambda t^2}\cdot\textbf{1}_{[0,\infty)}(t)$, where $\textbf{1}_A(x)$ is the indicator function $\textbf{1}_A(x)=\begin{cases}1,&\text{if }x\in A,\\0,&\text{else.}\end{cases}$ Let $\vec x=(x_1,\ldots,x_n)$ be a sample of $X$. Determine the maximum-likelihood estimator $\widehat{\lambda}$ such that the likelihood function $\mathcal L(\vec x;\lambda)$ satisfies $\forall \lambda\;:\;\mathcal L(\vec x;\lambda)\leq \mathcal L(\vec x;\widehat\lambda).$
For the sake of simplicity, my first thought was to pass to the log-likelihood: $\mathcal L(\vec x;\lambda)=\prod\limits_{i=1}^nf(x_i)\implies \ln(\mathcal L(\vec x;\lambda))=\sum\limits_{i=1}^n\ln(f(x_i)).$ This is the point where I'm stuck: I don't know how to compute the derivative needed to maximize the function, $\frac{\mathrm d \ln(\mathcal L(\vec x;\lambda))}{\mathrm d\lambda}\overset{!}{=}0.$ Any hints on how to differentiate this sum would be appreciated.
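To show how far I got with the setup (using only the pdf defined above, and assuming all $x_i>0$ so the indicator is $1$ and the logarithm is defined), expanding one summand of the log-likelihood term by term gives

$$\ln f(x_i)=\ln(2\lambda)+\ln(x_i)-\lambda x_i^2,$$

so that

$$\ln\mathcal L(\vec x;\lambda)=n\ln(2\lambda)+\sum_{i=1}^n\ln(x_i)-\lambda\sum_{i=1}^n x_i^2.$$

Only the first and last terms depend on $\lambda$, so presumably the differentiation should act on those alone.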