Suppose \(R \sim \mathrm{Rayleigh}(\theta)\). Then the density of \(R\) is given by (Rice, p. 321)
\[f(r \mid \theta) = \displaystyle {r \over \theta^2} \exp \left( { -r^2 \over 2 \theta^2} \right)\]
The cumulative distribution function of \(R\) is
\[F_R(r) = 1 - \exp \left( {-r^2 \over 2 \theta^2 }\right) \]
Note that \({d \over dr}F_R(r) = f(r \mid \theta)\)
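Because the CDF above has a closed-form inverse, Rayleigh variates are easy to draw by inverse-transform sampling. A minimal Python sketch (the function name \texttt{rayleigh\_sample} is illustrative, not from the text):

```python
import math
import random

def rayleigh_sample(theta, n, seed=0):
    # Invert u = F_R(r) = 1 - exp(-r^2 / (2 theta^2)):
    #   r = theta * sqrt(-2 * log(1 - u)),  with U ~ Uniform(0, 1)
    rng = random.Random(seed)
    return [theta * math.sqrt(-2.0 * math.log(1.0 - rng.random()))
            for _ in range(n)]

# Sanity check: the sample mean should be close to E[R] = theta * sqrt(pi/2)
sample = rayleigh_sample(theta=2.0, n=100_000)
print(sum(sample) / len(sample))  # close to 2 * sqrt(pi/2)
```

This sampler is used only as a sanity check on the moment formulas derived below.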
The data \( R_1, R_2, \ldots, R_n \) are i.i.d. \(\mathrm{Rayleigh}(\theta)\) random variables. The likelihood function is
\[\begin{array}{lcl} \mathrm{lik}(\theta) &=& f(r_1, \ldots, r_n \mid \theta) = \prod_{i=1}^n f(r_i \mid \theta) \\ &=& \prod_{i=1}^n \displaystyle \left[ {r_i \over \theta^2} \exp \left( { -r_i^2 \over 2 \theta^2 } \right) \right] \\ \end{array} \]
The log-likelihood function is
\[\begin{array}{lcl} \ell(\theta) &=& \log [\mathrm{lik}(\theta)] \\ &=& \sum_{i=1}^n \log(r_i) - 2n \log(\theta) - {1 \over \theta^2} \sum_{i=1}^n {r_i^2 \over 2} \\ \end{array} \]
The MLE solves \({d \over d \theta} \ell(\theta) = 0\):
\[\begin{array}{lcl} 0 &=& {d \over d \theta} \ell(\theta) \\ &=& - {2n \over \theta} + {2 \over \theta^3} \sum_{i=1}^n {r_i^2 \over 2} \\ \Longrightarrow \;\;\; \hat \theta_{MLE} &=& \left({ 1 \over n }\sum_{i=1}^n {r_i^2 \over 2}\right)^{1/2} \\ \end{array}\]
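The closed form for \(\hat\theta_{MLE}\) translates directly into code. A minimal Python sketch (the function name \texttt{rayleigh\_mle} is my own):

```python
import math

def rayleigh_mle(rs):
    # theta_hat = sqrt( (1/n) * sum(r_i^2 / 2) )
    n = len(rs)
    return math.sqrt(sum(r * r for r in rs) / (2.0 * n))

print(rayleigh_mle([1.0, 2.0, 3.0]))  # sqrt(14/6) ~ 1.5275
```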
The first moment of the \(\mathrm{Rayleigh}(\theta)\) distribution is
\[\begin{array}{lcl} \mu_1 &=& E[R \mid \theta ] = \int_0^\infty r f(r \mid \theta )\, dr \\ &=& \int_0^\infty r \, { r \over \theta^2} \exp\left({-r^2 \over 2 \theta^2}\right) dr\\ &=& {1 \over \theta^2} \int_0^\infty r^2 \exp\left({-r^2 \over 2 \theta^2}\right) dr\\ &=& {1 \over \theta^2} \int_0^\infty v \exp\left({-v \over 2 \theta^2}\right) {dv \over 2 \sqrt{v}} \;\; (\text{change of variables: } v=r^2,\; dv = 2r\,dr)\\ &=& {1 \over 2 \theta^2} \int_0^\infty v^{{3 \over 2} -1} \exp\left({-v \over 2 \theta^2}\right) dv\\ &=& {1 \over 2 \theta^2} \Gamma\left({3 \over 2}\right) (2 \theta^2)^{3 \over 2} \\ &=& \sqrt{2}\, \theta\, \Gamma\left({3 \over 2} \right) = \sqrt{2}\, \theta \times {1 \over 2} \Gamma\left({1 \over 2}\right) \\ &=& \theta \times {\sqrt{\pi} \over \sqrt{2} }\\ \end{array} \]
(using the facts that \(\Gamma(n+1)=n\Gamma(n)\) and \(\Gamma({1 \over 2}) = \sqrt{\pi}\))
The MOM estimate solves:
\[\begin{array}{rcl} \mu_1 &=& \hat \mu_1 = {1 \over n} \sum_{i=1}^n R_i = \overline{R}\\ \theta \times {\sqrt{\pi} \over \sqrt{2}} &=& \overline{R}\\ \Longrightarrow \;\; \hat \theta_{MOM} &=& \overline{R} \times {\sqrt{2} \over \sqrt{\pi}} \\ \end{array} \]
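The MOM estimate is likewise a one-liner. A minimal Python sketch (the function name \texttt{rayleigh\_mom} is my own):

```python
import math

def rayleigh_mom(rs):
    # theta_hat = mean(r) * sqrt(2 / pi)
    return (sum(rs) / len(rs)) * math.sqrt(2.0 / math.pi)

print(rayleigh_mom([1.0, 2.0, 3.0]))  # 2 * sqrt(2/pi) ~ 1.5958
```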
The approximate variance of the MLE is \(Var(\hat \theta_{MLE}) \approx { 1 \over n I(\theta)}\)
where
\[\begin{array}{lcl}I(\theta) &=& E\left[-{d^2 \over d \theta^2} \log f(X \mid \theta)\right] \\ &=& E\left[-{d^2 \over d \theta^2}\left(\log X - 2 \log \theta - {X^2 \over 2 \theta^2}\right)\right]\\ &=& E\left[-{d \over d \theta}\left(-{2 \over \theta} + X^2 \theta^{-3}\right)\right]\\ &=& E\left[ -\left( {2 \over \theta^2} - 3 X^2 \theta^{-4} \right) \right] \\ &=& 3 \theta^{-4} E[X^2] - {2 \over \theta^2} = 3 \theta^{-4} (2 \theta^2) - {2 \over \theta^2}\\ &=& {4 \over \theta^2}\\ \end{array}\]
So, \(Var(\hat \theta_{MLE}) \approx { \theta^2 \over 4 n}\)
The MOM estimate
\(\hat \theta_{MOM} = \overline{R} \times {\sqrt{2} \over \sqrt{\pi}} \)
has variance:
\(Var(\hat \theta_{MOM}) = ({\sqrt{2} \over \sqrt{\pi}})^2 Var(\overline{R}) = ({2 \over \pi}) {Var(R) \over n} \)
\[\begin{array}{lcl} Var(R) &=& E[R^2] - (E[R])^2 \\ &=& 2 \theta^2 - \left( \sqrt{\pi \over 2}\, \theta\right)^2 \\ &=& \theta^2\left( 2 - {\pi \over 2}\right) \\ \end{array} \]
(Here \(E[R^2] = 2\theta^2\) follows from the same substitution \(v = r^2\) used above: \(E[R^2] = {1 \over 2\theta^2}\int_0^\infty v \exp\left({-v \over 2\theta^2}\right) dv = {1 \over 2\theta^2}\,\Gamma(2)\,(2\theta^2)^2 = 2\theta^2\).)
So, \(Var(\hat \theta_{MOM}) = \theta^2 (2 - {\pi \over 2}) ({2 \over \pi})({1 \over n}) = \theta^2 ( { 4 \over \pi} -1) ({1 \over n}) \approx {\theta^2 \over n} \times 0.2732\)
This exceeds the approximate \(Var(\hat \theta_{MLE}) \approx {\theta^2 \over n} \times 0.25\), so the MLE is the more efficient estimator.
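The two scaled variances can be checked by simulation. A rough Monte Carlo sketch (all function names are mine; tolerances must be loose, since \(\theta^2/(4n)\) is only an asymptotic approximation):

```python
import math
import random

def scaled_variances(theta=1.0, n=100, reps=20_000, seed=1):
    # For each replication: draw n Rayleigh(theta) variates by inverse
    # transform, compute both estimates, then return n * Var for each.
    # These should sit near theta^2/4 (MLE) and theta^2*(4/pi - 1) (MOM).
    rng = random.Random(seed)
    mles, moms = [], []
    for _ in range(reps):
        rs = [theta * math.sqrt(-2.0 * math.log(1.0 - rng.random()))
              for _ in range(n)]
        mles.append(math.sqrt(sum(r * r for r in rs) / (2.0 * n)))
        moms.append(sum(rs) / n * math.sqrt(2.0 / math.pi))
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return n * var(mles), n * var(moms)

v_mle, v_mom = scaled_variances()
print(v_mle, v_mom)  # near 0.25 and 0.2732, with v_mom > v_mle
```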