An example of maximum likelihood estimation

Let us take $N$ measured data points $y_i$ that are independent but were produced by the same process, which can be described by a normal distribution

$$\wp\left(y_i \mid \mu, \sigma\right) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{\left(y_i - \mu\right)^2}{2\sigma^2}}$$

with expected value $\mu$ and standard deviation $\sigma$ (so that $\sigma^2$ is the variance).

The joint probability density function of the independent $y_i$ events is the product of the individual pdfs:

$$\wp\left(\mathbf{y} \mid \mu, \sigma\right) = \prod_{i=1}^{N} \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{\left(y_i - \mu\right)^2}{2\sigma^2}} = \left[\frac{1}{\sqrt{2\pi\sigma^2}}\right]^{N} e^{-\sum_{i=1}^{N} \frac{\left(y_i - \mu\right)^2}{2\sigma^2}}$$

The log-likelihood function reads:

$$\ln \wp\left(\mathbf{y} \mid \mu, \sigma\right) = \frac{N}{2}\ln\left(\frac{1}{2\pi\sigma^2}\right) - \sum_{i=1}^{N} \frac{\left(y_i - \mu\right)^2}{2\sigma^2}$$
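As a quick numerical check of the log-likelihood above, the following Python sketch evaluates it on a grid of $\mu$ values for a simulated sample; the seed, sample size, and true parameters ($\mu = 2.0$, $\sigma = 1.5$) are illustrative assumptions, not part of the derivation.

```python
import numpy as np

# Illustrative data set: N independent draws from a normal distribution
# (the true parameters mu=2.0, sigma=1.5 and the seed are arbitrary choices)
rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.5, size=10_000)
N = len(y)

def log_likelihood(mu, sigma):
    """Log-likelihood of the normal model for the sample y."""
    return N / 2 * np.log(1 / (2 * np.pi * sigma**2)) \
        - np.sum((y - mu) ** 2) / (2 * sigma**2)

# Evaluate on a grid of mu values at fixed sigma; since the log-likelihood
# is quadratic in mu, its maximum sits at the sample mean.
mus = np.linspace(1.0, 3.0, 2001)
ll = np.array([log_likelihood(m, 1.5) for m in mus])
best_mu = mus[ll.argmax()]
print(best_mu, y.mean())  # agree to within the grid spacing
```

The grid maximum lands on the grid point closest to the sample mean, which anticipates the closed-form result derived next.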

We now look for the maximum by differentiating and setting the result equal to zero:

$$\partial_{\mu} \ln \wp\left(\mathbf{y} \mid \mu, \sigma\right) = \sum_{i=1}^{N} \frac{y_i - \mu}{\sigma^2} = 0$$
from which our estimate for $\mu$ is:

$$\widehat{\mu} = \frac{1}{N}\sum_{i=1}^{N} y_i,$$
the average of the measured data. The partial derivative with respect to $\sigma$:

$$\partial_{\sigma} \ln \wp\left(\mathbf{y} \mid \mu, \sigma\right) = -\frac{N}{\sigma} + \sum_{i=1}^{N} \frac{\left(y_i - \mu\right)^2}{\sigma^3} = 0$$
from that:

$$\widehat{\sigma}^{2} = \frac{1}{N}\sum_{i=1}^{N} \left(y_i - \mu\right)^2$$
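The two closed-form estimators can be checked in a few lines of Python; this is a minimal sketch, and the sample, seed, and true parameters are illustrative assumptions.

```python
import numpy as np

# Illustrative sample (the seed and true parameters are arbitrary)
rng = np.random.default_rng(1)
y = rng.normal(loc=0.5, scale=2.0, size=5_000)
N = len(y)

mu_hat = np.sum(y) / N                      # the ML estimate of mu: the sample mean
sigma2_hat = np.sum((y - mu_hat) ** 2) / N  # the ML estimate of sigma^2, with mu replaced by mu_hat

# NumPy's var with ddof=0 divides by N, i.e. it computes exactly this ML estimate
assert np.isclose(mu_hat, np.mean(y))
assert np.isclose(sigma2_hat, np.var(y, ddof=0))
```

Note that `np.var` defaults to `ddof=0`, so the plain sample variance returned by NumPy is precisely the maximum likelihood estimate.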

For completeness we mention that the formula for the variance still contains the unknown $\mu$, and we may be tempted to estimate it by $\widehat{\mu}$. As is well known, this results in a biased estimate, which can be corrected by a factor of $N/(N-1)$.
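The $N/(N-1)$ correction corresponds to NumPy's `ddof` (delta degrees of freedom) argument; a small sketch, with an arbitrary seed and sample size:

```python
import numpy as np

# Small illustrative sample, where the bias is most visible
rng = np.random.default_rng(2)
N = 10
y = rng.normal(size=N)

sigma2_ml = np.var(y, ddof=0)    # ML estimate: divides by N (biased)
sigma2_corr = np.var(y, ddof=1)  # corrected estimate: divides by N - 1

# The two differ exactly by the factor N/(N-1)
assert np.isclose(sigma2_corr, sigma2_ml * N / (N - 1))
```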

Now we proceed with estimation theory and the Bayesian estimators.
