Oct 01, 2024
Project Proposal due Thursday, October 3 at 11:59pm
Lab 03 due Thursday, October 3 at 11:59pm
HW 02 due Thursday, October 3 at 11:59pm (released after class)
Exam 01: Tuesday, October 8 (in class + take-home)
Lecture recordings are available until the start of the in-class exam (link in the sidebar of the course webpage)
Exam review on Thursday
Monday’s lab: Exam office hours
No office hours while take-home exam is out
Note
This is not a mathematical statistics class. There are semester-long courses that will go into these topics in much more detail; we will barely scratch the surface in this course.
Our goals are to understand:
That estimators have properties
A few properties of the least-squares estimator and why they are useful
We have discussed how to use least squares to find an estimator of $\boldsymbol{\beta}$ in the linear model $\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\epsilon}$: the least-squares estimator is $\hat{\boldsymbol{\beta}} = (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top\mathbf{y}$.
How do we know whether our least-squares estimator is a “good” estimator?
When we consider what makes an estimator “good”, we’ll look at three criteria:
We’ll take a look at these and motivate why we might prefer using least squares to compute $\hat{\boldsymbol{\beta}}$.
Suppose you are throwing darts at a target
Ideal scenario: Darts are clustered around the target (unbiased and low variance)
Worst case scenario: Darts are widely spread out and systematically far from the target (high bias and high variance)
Acceptable scenario: There’s some trade-off between the bias and variance.
Finite sample (fixed $n$)
Unbiased estimator
Best Linear Unbiased Estimator (BLUE)
Infinite sample ($n \rightarrow \infty$)
Consistent estimator
Efficient estimator
The bias of an estimator is the difference between the estimator’s expected value and the true value of the parameter
Let $\hat{\theta}$ be an estimator of the parameter $\theta$. Then
$$\text{Bias}(\hat{\theta}) = E[\hat{\theta}] - \theta.$$
An estimator is unbiased if the bias is 0 and thus $E[\hat{\theta}] = \theta$.
The least-squares estimator $\hat{\boldsymbol{\beta}} = (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top\mathbf{y}$ is an unbiased estimator of $\boldsymbol{\beta}$, i.e. $E[\hat{\boldsymbol{\beta}}] = \boldsymbol{\beta}$.
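As a quick check, unbiasedness follows directly from the model assumptions. A minimal sketch, treating $\mathbf{X}$ as fixed and assuming $E[\boldsymbol{\epsilon}] = \mathbf{0}$:
$$E[\hat{\boldsymbol{\beta}}] = E\left[(\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top\mathbf{y}\right] = (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top E[\mathbf{X}\boldsymbol{\beta} + \boldsymbol{\epsilon}] = (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top\mathbf{X}\boldsymbol{\beta} = \boldsymbol{\beta}.$$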
What does it mean for a model to be a “linear” regression model?
Linear regression models are linear in the parameters, i.e. given an observation $y_i$, the model can be written as
$$y_i = \beta_0 + \beta_1 f_1(x_i) + \beta_2 f_2(x_i) + \cdots + \beta_p f_p(x_i) + \epsilon_i.$$
The functions $f_1, \ldots, f_p$ may be non-linear functions of the predictors; what matters is that the model is linear in the coefficients $\beta_0, \beta_1, \ldots, \beta_p$.
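For example, $y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + \epsilon_i$ and $y_i = \beta_0 + \beta_1 \log(x_i) + \epsilon_i$ are both linear regression models, even though they are not linear in $x_i$, whereas $y_i = \beta_0 e^{\beta_1 x_i} + \epsilon_i$ is not, because it is not linear in $\beta_1$.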
Gauss-Markov Theorem
The least-squares estimator of $\boldsymbol{\beta}$ in the linear model $\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\epsilon}$, where $E[\boldsymbol{\epsilon}] = \mathbf{0}$ and $\text{Var}(\boldsymbol{\epsilon}) = \sigma^2\mathbf{I}$, is the best linear unbiased estimator (BLUE).
“Best” means $\hat{\boldsymbol{\beta}}$ has the smallest variance among all linear unbiased estimators of $\boldsymbol{\beta}$.
Suppose $\tilde{\boldsymbol{\beta}} = \mathbf{C}\mathbf{y}$ is another linear estimator of $\boldsymbol{\beta}$.
Let $\mathbf{C} = (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top + \mathbf{D}$ for some non-zero matrix $\mathbf{D}$.
What is the dimension of $\mathbf{D}$?
We need to show when $\tilde{\boldsymbol{\beta}}$ is unbiased, i.e. when $E[\tilde{\boldsymbol{\beta}}] = \boldsymbol{\beta}$:
$$E[\tilde{\boldsymbol{\beta}}] = \mathbf{C}\,E[\mathbf{y}] = \left[(\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top + \mathbf{D}\right]\mathbf{X}\boldsymbol{\beta} = \boldsymbol{\beta} + \mathbf{D}\mathbf{X}\boldsymbol{\beta}.$$
What assumption(s) of the Gauss-Markov Theorem did we use?
What must be true for $\tilde{\boldsymbol{\beta}}$ to be unbiased? We need $\mathbf{D}\mathbf{X}\boldsymbol{\beta} = \mathbf{0}$ for every $\boldsymbol{\beta}$, i.e. $\mathbf{D}\mathbf{X} = \mathbf{0}$.
Now we need to find $\text{Var}(\tilde{\boldsymbol{\beta}}) = \text{Var}(\mathbf{C}\mathbf{y}) = \mathbf{C}\,\text{Var}(\mathbf{y})\,\mathbf{C}^\top = \sigma^2\,\mathbf{C}\mathbf{C}^\top$.
What assumption(s) of the Gauss-Markov Theorem did we use?
We have, using $\mathbf{D}\mathbf{X} = \mathbf{0}$,
$$\text{Var}(\tilde{\boldsymbol{\beta}}) = \sigma^2\,\mathbf{C}\mathbf{C}^\top = \sigma^2\left[(\mathbf{X}^\top\mathbf{X})^{-1} + \mathbf{D}\mathbf{D}^\top\right] = \text{Var}(\hat{\boldsymbol{\beta}}) + \sigma^2\,\mathbf{D}\mathbf{D}^\top.$$
We know that $\mathbf{D}\mathbf{D}^\top$ is positive semi-definite, so $\text{Var}(\tilde{\boldsymbol{\beta}}) \geq \text{Var}(\hat{\boldsymbol{\beta}})$.
When is $\text{Var}(\tilde{\boldsymbol{\beta}}) = \text{Var}(\hat{\boldsymbol{\beta}})$? Only when $\mathbf{D} = \mathbf{0}$, i.e. when $\tilde{\boldsymbol{\beta}} = \hat{\boldsymbol{\beta}}$.
Therefore, we have shown that $\hat{\boldsymbol{\beta}}$ has the smallest variance among all linear unbiased estimators of $\boldsymbol{\beta}$, i.e. it is BLUE.
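To see the theorem in action, here is a small simulation sketch (a made-up setup, not from the slides): it compares the full-data least-squares estimator with another linear unbiased estimator, namely least squares computed on only the first half of the observations. Both are approximately unbiased in the simulation, but the full-data estimator has the smaller variance, as the Gauss-Markov Theorem predicts.

```python
import numpy as np

rng = np.random.default_rng(42)
n, p = 100, 2                          # sample size and number of predictors (illustrative values)
beta = np.array([1.0, 2.0, -0.5])      # true coefficients, intercept first (made up)
sigma = 1.0                            # error standard deviation (made up)

# Fixed design matrix with an intercept column
X = np.column_stack([np.ones(n), rng.uniform(-2, 2, size=(n, p))])

def ols(X, y):
    """Least-squares estimate (X^T X)^{-1} X^T y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

reps = 5000
full, half = [], []
for _ in range(reps):
    y = X @ beta + rng.normal(0, sigma, size=n)
    full.append(ols(X, y))                       # least squares on all n observations
    half.append(ols(X[: n // 2], y[: n // 2]))   # another linear unbiased estimator

full, half = np.array(full), np.array(half)
print("Mean estimate, full-data OLS:", full.mean(axis=0).round(3))   # close to beta (unbiased)
print("Mean estimate, half-data OLS:", half.mean(axis=0).round(3))   # also close to beta (unbiased)
print(f"Variance of first slope, full-data OLS: {full[:, 1].var():.4f}")
print(f"Variance of first slope, half-data OLS: {half[:, 1].var():.4f}")  # larger, as expected
```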
Gauss-Markov Theorem
The least-squares estimator of $\boldsymbol{\beta}$ in the linear model $\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\epsilon}$, where $E[\boldsymbol{\epsilon}] = \mathbf{0}$ and $\text{Var}(\boldsymbol{\epsilon}) = \sigma^2\mathbf{I}$, is the best linear unbiased estimator (BLUE).
“Best” means $\hat{\boldsymbol{\beta}}$ has the smallest variance among all linear unbiased estimators of $\boldsymbol{\beta}$.
Finite sample (fixed $n$)
Unbiased estimator ✅
Best Linear Unbiased Estimator (BLUE) ✅
Infinite sample ($n \rightarrow \infty$)
Consistent estimator
Efficient estimator
The mean squared error (MSE) is the expected squared difference between the estimator and the parameter.
Let $\hat{\theta}$ be an estimator of the parameter $\theta$. Then
$$\text{MSE}(\hat{\theta}) = E\left[(\hat{\theta} - \theta)^2\right].$$
The least-squares estimator $\hat{\boldsymbol{\beta}}$ is unbiased, so its MSE is equal to its variance.
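The bias-variance decomposition behind this last point is a standard identity. Writing $\mu = E[\hat{\theta}]$ and using $E[\hat{\theta} - \mu] = 0$,
$$\text{MSE}(\hat{\theta}) = E\left[(\hat{\theta} - \mu + \mu - \theta)^2\right] = \underbrace{E\left[(\hat{\theta} - \mu)^2\right]}_{\text{Var}(\hat{\theta})} + \underbrace{(\mu - \theta)^2}_{\text{Bias}(\hat{\theta})^2},$$
since the cross term $2(\mu - \theta)\,E[\hat{\theta} - \mu]$ vanishes. For an unbiased estimator, the bias term is 0 and the MSE reduces to the variance.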
An estimator $\hat{\theta}$ is a consistent estimator of $\theta$ if $\hat{\theta}$ converges in probability to $\theta$ as the sample size $n \rightarrow \infty$.
This means that as the sample size goes to $\infty$, the estimator gets arbitrarily close to the true value of the parameter with probability approaching 1.
Why is this a useful property of an estimator?
Important
Theorem
An estimator $\hat{\theta}$ is a consistent estimator of the parameter $\theta$ if $\text{MSE}(\hat{\theta}) \rightarrow 0$ as $n \rightarrow \infty$.
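A one-line justification of this theorem comes from applying Markov’s inequality to $(\hat{\theta} - \theta)^2$: for any $\varepsilon > 0$,
$$P\left(|\hat{\theta} - \theta| > \varepsilon\right) = P\left((\hat{\theta} - \theta)^2 > \varepsilon^2\right) \leq \frac{E\left[(\hat{\theta} - \theta)^2\right]}{\varepsilon^2} = \frac{\text{MSE}(\hat{\theta})}{\varepsilon^2} \rightarrow 0,$$
so $\hat{\theta}$ converges in probability to $\theta$.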
Now we need to show that $\text{MSE}(\hat{\boldsymbol{\beta}}) = \text{Var}(\hat{\boldsymbol{\beta}}) \rightarrow 0$ as $n \rightarrow \infty$ (the bias is 0, so only the variance matters).
What is $\text{Var}(\hat{\boldsymbol{\beta}})$? Recall that $\text{Var}(\hat{\boldsymbol{\beta}}) = \sigma^2(\mathbf{X}^\top\mathbf{X})^{-1}$.
Does $\text{Var}(\hat{\boldsymbol{\beta}}) \rightarrow \mathbf{0}$ as $n \rightarrow \infty$?
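As an illustrative sketch (a made-up simulation, not from the slides), the following simulates simple linear regression for increasing sample sizes and tracks the spread of the least-squares slope estimate across repeated datasets; the standard deviation shrinks toward 0 as $n$ grows, which is the behavior consistency describes.

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, sigma = 1.0, 2.0, 1.0   # true intercept, slope, and error SD (made up)

def sd_slope(n, reps=2000):
    """Empirical SD of the least-squares slope across simulated datasets of size n."""
    slopes = []
    for _ in range(reps):
        x = rng.uniform(0, 10, size=n)
        y = beta0 + beta1 * x + rng.normal(0, sigma, size=n)
        X = np.column_stack([np.ones(n), x])
        slopes.append(np.linalg.solve(X.T @ X, X.T @ y)[1])   # slope component of the LS fit
    return np.std(slopes)

for n in [20, 100, 500, 2500]:
    print(f"n = {n:5d}   SD of slope estimate = {sd_slope(n):.4f}")   # shrinks toward 0
```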
The efficiency of an estimator is concerned with its asymptotic variance.
The estimator with the smallest variance is considered the most efficient.
By the Gauss-Markov Theorem, we have shown that the least-squares estimator is the most efficient among linear unbiased estimators.
Finite sample (fixed $n$)
Unbiased estimator ✅
Best Linear Unbiased Estimator (BLUE) ✅
Infinite sample ($n \rightarrow \infty$)
Consistent estimator ✅
Efficient estimator ✅