Estimators

Method of Moments

  • Sometimes it can yield nonsensical values, i.e., estimates outside the parameter space (see the example below).
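
A standard illustration, using the same uniform model as in the proof section below: if $X_1, \dots, X_n \sim Unif(0, \theta)$, then $E(X_i) = \theta / 2$, so matching the first moment gives

$$ \hat{\theta}_{MoM} = 2\bar{X} $$

It is entirely possible that $2\bar{x} < \max(x_1, \dots, x_n)$, i.e., the estimate of $\theta$ is smaller than a value that was actually observed.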

Maximum Likelihood Estimators

  • The range of the MLE coincides with the range of the parameter.
  • Use the first derivative of the (log-)likelihood to locate the maximum or minimum; check that the second derivative is negative to confirm it is a maximum (see the worked example below).
  • Invariance property of the MLE: if $\hat{\theta}$ is the MLE of $\theta$, then $g(\hat{\theta})$ is the MLE of $g(\theta)$.
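
A compact worked example of the derivative recipe and the invariance property (Bernoulli is chosen purely for illustration): let $X_1, \dots, X_n \sim Bernoulli(p)$. Then

$$ \begin{eqnarray} \ell(p) &=& \sum x_i \log p + \left(n - \sum x_i\right) \log(1 - p) \\ \ell'(p) &=& \frac{\sum x_i}{p} - \frac{n - \sum x_i}{1 - p} = 0 \implies \hat{p} = \bar{x} \\ \ell''(p) &=& -\frac{\sum x_i}{p^2} - \frac{n - \sum x_i}{(1 - p)^2} < 0 \end{eqnarray} $$

so $\hat{p} = \bar{x}$ is a maximum, and by invariance the MLE of $p(1 - p)$ is $\bar{x}(1 - \bar{x})$.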

Bayes Estimator

  • Normal prior, normal likelihood:

$$ \theta \sim N(\mu, \tau^2) \\
X | \theta \sim N(\theta, \sigma^2) $$

Then,

$$ \begin{eqnarray} E(\theta | x) &=& \frac{\tau^2}{\tau^2 + \sigma^2}x + \frac{\sigma^2}{\sigma^2 + \tau^2}\mu \\
Var(\theta | x) &=& \frac{\sigma^2 \tau^2}{\sigma^2 + \tau^2} \end{eqnarray} $$
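
A quick numerical sanity check of these two formulas; this is only a sketch, and the values of mu, tau, sigma, and the observed x below are arbitrary choices.

```python
import numpy as np

# Grid approximation of the posterior: prior * likelihood, then normalize.
mu, tau, sigma, x = 1.0, 2.0, 1.5, 3.0

theta = np.linspace(mu - 10 * tau, mu + 10 * tau, 200_001)
prior = np.exp(-0.5 * ((theta - mu) / tau) ** 2)
likelihood = np.exp(-0.5 * ((x - theta) / sigma) ** 2)
weights = prior * likelihood
weights /= weights.sum()

post_mean = np.sum(theta * weights)
post_var = np.sum((theta - post_mean) ** 2 * weights)

# Closed-form values from the formulas above.
w = tau**2 / (tau**2 + sigma**2)
print(post_mean, w * x + (1 - w) * mu)                    # should agree
print(post_var, sigma**2 * tau**2 / (sigma**2 + tau**2))  # should agree
```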

Methods of Evaluating Estimators

  1. Look at the mean squared error (MSE), $MSE(\hat{\theta}) = E_\theta(\hat{\theta} - \theta)^2$ (see the decomposition below).
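
The MSE splits into variance plus squared bias (a standard identity), so it penalizes both imprecision and systematic error:

$$ MSE(\hat{\theta}) = Var_\theta(\hat{\theta}) + \left(E_\theta(\hat{\theta}) - \theta\right)^2 $$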

Proofs

Uniform, Max

$$ X_1, \dots, X_n \overset{iid}{\sim} Unif(0, \theta) $$

Find an unbiased estimator of $\theta$ based on the sample maximum.

Set

$$ f_X(x) = \frac{1}{\theta}, \quad 0 \leq x \leq \theta \\ Y = \max(X_1, \dots, X_n) $$
$$ \begin{eqnarray} F_Y(y) = P(Y \leq y) &=& P(\max(X_1, \dots, X_n) \leq y) \\ &=& P(X_1 \leq y, X_2 \leq y, \dots, X_n \leq y) \\ &=& \prod_{i=1}^n P(X_i \leq y) \\ &=& \left(\frac{y}{\theta}\right)^n, \quad 0 \leq y \leq \theta \end{eqnarray} $$

Therefore,

$$ f_Y(y) = \frac{\partial}{\partial y} F_Y(y) = n\left(\frac{y}{\theta}\right)^{n-1}\frac{1}{\theta} = n\frac{y^{n-1}}{\theta^n} $$
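
A quick Monte Carlo check of the CDF $F_Y(y) = (y/\theta)^n$ derived above; this is only a sketch, and the values of theta, n, and the number of replications are arbitrary choices.

```python
import numpy as np

# Simulate the maximum of n Unif(0, theta) draws, many times over.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000

maxima = rng.uniform(0, theta, size=(reps, n)).max(axis=1)

# Empirical P(Y <= y) vs. the closed form (y / theta)^n.
for y in (0.5, 1.0, 1.5):
    print(y, (maxima <= y).mean(), (y / theta) ** n)
```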

To get the expected value,

$$ \begin{eqnarray} E(Y) = \int_{0}^{\theta} y \, n\frac{y^{n-1}}{\theta^n} \, dy &=& \int_{0}^{\theta} n\frac{y^{n}}{\theta^n} \, dy \\ &=& \frac{n}{\theta^n}\left[\frac{y^{n+1}}{n+1}\right]_{0}^{\theta} \\ &=& \frac{n}{n+1} \theta \end{eqnarray} $$

Therefore,

$$ \frac{n+1}{n} Y $$

is an unbiased estimator of $\theta$, since $E\left(\frac{n+1}{n} Y\right) = \frac{n+1}{n} \cdot \frac{n}{n+1} \theta = \theta$.
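
A simulation check of this conclusion; it is only a sketch, and theta, n, and the number of replications below are arbitrary choices.

```python
import numpy as np

# Average the raw maximum Y and the corrected estimator over many samples.
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 5, 500_000

y = rng.uniform(0, theta, size=(reps, n)).max(axis=1)

print(y.mean())                  # ~ n / (n + 1) * theta, i.e. biased low
print((n + 1) / n * y.mean())    # ~ theta, i.e. approximately unbiased
```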