Suppose that $y_1,\ldots,y_n$ are observations of $Y_1,\ldots,Y_n$. Then the likelihood function is

$$
L(\alpha,\sigma^2)=\prod_{i=1}^n (2\pi \sigma^2)^{-1/2}\exp\left(-\frac{1}{2\sigma^2}(y_i-\alpha x_i)^2\right).
$$

When $(\alpha,\sigma^2)$ is allowed to vary over $\mathbb{R}\times (0,\infty)$, the likelihood function attains its maximum at $L(\hat{\alpha},\hat{\sigma}^2)$, where $\hat{\alpha}$ and $\hat{\sigma}^2$ are the maximum likelihood estimates,

$$
\hat{\alpha}=\frac{\sum_{i=1}^n x_i y_i}{\sum_{i=1}^n x_i^2},
\qquad
\hat{\sigma}^2=\frac{1}{n}\sum_{i=1}^n(y_i-\hat{\alpha}x_i)^2.
$$

Under the hypothesis that $\alpha$ equals a fixed real number, the likelihood function attains its maximum at $L(\alpha,\tilde{\sigma}^2)$, where

$$
\tilde{\sigma}^2=\frac{1}{n}\sum_{i=1}^n(y_i-\alpha x_i)^2.
$$

Inserting these two estimates of $\sigma^2$, both exponents reduce to $-n/2$, so the likelihood ratio is

$$
\begin{aligned}
Q=\frac{L(\alpha,\tilde{\sigma}^2)}{L(\hat{\alpha},\hat{\sigma}^2)}
&=\frac{(2\pi\tilde{\sigma}^2)^{-n/2}\exp\left(-\frac{1}{2\tilde{\sigma}^2}\sum_{i=1}^n (y_i-\alpha x_i)^2\right)}{(2\pi\hat{\sigma}^2)^{-n/2}\exp\left(-\frac{1}{2\hat{\sigma}^2}\sum_{i=1}^n (y_i-\hat{\alpha} x_i)^2\right)}
=\left(\frac{\sum_{i=1}^n(y_i-\alpha x_i)^2}{\sum_{i=1}^n(y_i-\hat{\alpha}x_i)^2}\right)^{-n/2}\frac{e^{-n/2}}{e^{-n/2}}\\
&=\left(1+\frac{(\hat{\alpha}-\alpha)^2\sum_{i=1}^n x_i^2}{\sum_{i=1}^n(y_i-\hat{\alpha}x_i)^2}\right)^{-n/2},
\end{aligned}
$$

where the last equality uses the hint

$$
\sum_{i=1}^n(y_i-\alpha x_i)^2=\sum_{i=1}^n(y_i-\hat{\alpha}x_i)^2+(\hat{\alpha}-\alpha)^2\sum_{i=1}^n x_i^2
$$

(the cross term vanishes because $\sum_{i=1}^n x_i(y_i-\hat{\alpha}x_i)=0$ by the normal equation). Writing $s^2=\frac{1}{n-1}\sum_{i=1}^n(y_i-\hat{\alpha}x_i)^2$ and $T=\frac{(\hat{\alpha}-\alpha)\sqrt{\sum_{i=1}^n x_i^2}}{s}$, we obtain

$$
Q=\left(1+\frac{1}{n-1}T^2\right)^{-n/2}
\quad\text{and}\quad
-2\log Q=n\log\left(1+\frac{1}{n-1}T^2\right).
$$
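
As a numerical sanity check of this identity, the sketch below simulates data from the assumed model $Y_i=\alpha x_i+\varepsilon_i$ with independent $N(0,\sigma^2)$ errors, computes $\log Q$ directly from the two maximized likelihoods, and compares it with $-\frac{n}{2}\log\left(1+\frac{1}{n-1}T^2\right)$. The variable names (`alpha0` for the hypothesized $\alpha$, `alpha_hat`, etc.) are illustrative choices, not part of the solution above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate from Y_i = alpha * x_i + eps_i, eps_i ~ N(0, sigma^2)
n, alpha0, sigma = 25, 1.5, 0.8        # alpha0: hypothesized value of alpha
x = rng.uniform(1.0, 3.0, size=n)
y = alpha0 * x + rng.normal(0.0, sigma, size=n)

# Maximum likelihood estimates (regression through the origin)
alpha_hat = np.sum(x * y) / np.sum(x ** 2)
sigma2_hat = np.mean((y - alpha_hat * x) ** 2)    # unrestricted MLE of sigma^2
sigma2_tilde = np.mean((y - alpha0 * x) ** 2)     # MLE of sigma^2 with alpha fixed

# Log-likelihood of the normal model at (a, s2)
def log_lik(a, s2):
    return -0.5 * n * np.log(2 * np.pi * s2) - np.sum((y - a * x) ** 2) / (2 * s2)

# log Q computed directly from the two maximized likelihoods
log_Q = log_lik(alpha0, sigma2_tilde) - log_lik(alpha_hat, sigma2_hat)

# The same quantity via T = (alpha_hat - alpha0) * sqrt(sum x_i^2) / s,
# with s^2 = sum (y_i - alpha_hat * x_i)^2 / (n - 1)
s2 = np.sum((y - alpha_hat * x) ** 2) / (n - 1)
T = (alpha_hat - alpha0) * np.sqrt(np.sum(x ** 2)) / np.sqrt(s2)
log_Q_from_T = -0.5 * n * np.log1p(T ** 2 / (n - 1))

print(log_Q, log_Q_from_T)   # agree up to floating-point rounding
```

The two printed values coincide for any simulated data set, since the equality $Q=\left(1+\frac{1}{n-1}T^2\right)^{-n/2}$ is exact rather than asymptotic.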