Overview
On this page, we discuss the method of Maximum Likelihood Estimation to estimate an unknown population parameter from a single sample.
Basic learning objectives
These are the tasks you should be able to perform with reasonable fluency when you arrive at your next class meeting. Important new vocabulary words are indicated in italics.
- Understand the basic idea of maximum likelihood estimation: to guess the value of a parameter, given some observed sample data, one determines the parameter value that makes that sample data as likely as possible to occur.
- Know the intuitive meaning and precise definition of the *likelihood function*. This function takes a sample and a parameter value as inputs; its output is the probability (or, for a continuous distribution, the probability density) of observing that sample when the parameter takes that value.
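The first objective can be sketched numerically. Below is a minimal illustration (not from this page's materials; the sample of 7 heads in 10 coin flips is a made-up example): we fix the observed data, evaluate the likelihood at a grid of candidate parameter values, and pick the value that makes the observed sample most likely.

```python
# Hypothetical sample: 7 heads in 10 independent coin flips.
# The likelihood of this exact ordered sequence, as a function of the
# heads probability p, is p^7 * (1 - p)^3.
def likelihood(p, heads=7, flips=10):
    """Probability of the observed sample when the parameter equals p."""
    return p ** heads * (1 - p) ** (flips - heads)

# Evaluate the likelihood on a grid of candidate parameter values and
# keep the one that makes the observed sample most likely.
candidates = [i / 100 for i in range(1, 100)]
best = max(candidates, key=likelihood)
print(best)  # 0.7
```

The grid search lands on the sample proportion 7/10, which is exactly what the calculus approach (setting the derivative of the log-likelihood to zero) gives for this model.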
Advanced learning objectives
In addition to mastering the basic objectives, here are the tasks you should be able to perform after class, with practice:
- Be able to find the maximum likelihood estimate and maximum likelihood estimator for a parameter of a given distribution. Don’t forget to show that the maximum of the likelihood is indeed achieved at the critical point you find.
- Know that when the value of one parameter is determined by another, the maximum likelihood estimator for one immediately yields the maximum likelihood estimator for the other.
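The last objective (the invariance idea) can be checked numerically. This is an illustrative sketch, not from the page's materials: with the same made-up sample of 7 heads in 10 flips, we reparameterize the heads probability p by the odds theta = p / (1 - p), maximize the likelihood in each parameterization, and confirm that the two maximizers correspond under the reparameterization.

```python
# Likelihood of 7 heads in 10 flips as a function of p (illustrative sample).
def likelihood_p(p, heads=7, flips=10):
    return p ** heads * (1 - p) ** (flips - heads)

# The same likelihood written in terms of the odds theta = p / (1 - p).
def likelihood_odds(theta, heads=7, flips=10):
    p = theta / (1 + theta)  # invert theta = p / (1 - p)
    return likelihood_p(p, heads, flips)

grid_p = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid_p, key=likelihood_p)

# Maximize over the same candidates, expressed as odds.
grid_theta = [p / (1 - p) for p in grid_p]
theta_hat = max(grid_theta, key=likelihood_odds)

# Invariance: the MLE of the odds is the odds of the MLE.
print(p_hat, theta_hat, p_hat / (1 - p_hat))
```

The maximizer in the odds parameterization is exactly `p_hat / (1 - p_hat)`: no second optimization is really needed once `p_hat` is known.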
To prepare for class
- Watch the following video (by StatQuest / Josh Starmer) which explains the principle of Maximum Likelihood Estimation:
- Watch the following video (by Brendan Cordy) which goes through a detailed example of a maximum likelihood estimation for a geometric distribution, using the log-likelihood and the technique of logarithmic differentiation, and also shows how to find the maximum likelihood estimator:
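For reference, the kind of computation the video walks through can be sketched as follows. This uses the convention that the geometric pmf is $P(X = k) = (1-p)^{k-1}\,p$ for $k = 1, 2, \ldots$; the video's notation may differ.

```latex
% Likelihood of a sample x_1, ..., x_n from a geometric distribution:
\[
L(p) \;=\; \prod_{i=1}^{n} (1-p)^{x_i - 1}\, p
      \;=\; p^{\,n}\,(1-p)^{\sum_{i}(x_i - 1)} .
\]
% Taking logarithms turns the product into a sum:
\[
\ell(p) \;=\; \log L(p) \;=\; n \log p \;+\; \Bigl(\textstyle\sum_{i=1}^{n} x_i - n\Bigr) \log(1-p) .
\]
% Setting the derivative to zero and solving:
\[
\ell'(p) \;=\; \frac{n}{p} \;-\; \frac{\sum_i x_i - n}{1-p} \;=\; 0
\quad\Longrightarrow\quad
\hat{p} \;=\; \frac{n}{\sum_{i=1}^{n} x_i} \;=\; \frac{1}{\bar{x}} .
\]
% Since \ell''(p) = -n/p^2 - (\sum_i x_i - n)/(1-p)^2 < 0 for 0 < p < 1,
% the likelihood is indeed maximized at this critical point.
```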
After class
- Watch the following video (by StatQuest / Josh Starmer) which clarifies the difference between probability and likelihood:
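The distinction can also be seen numerically. This is a minimal sketch (illustrative, not taken from the video): probability fixes the parameter and varies the data, so the values sum to 1 over all outcomes; likelihood fixes the data and varies the parameter, and the resulting values need not sum to 1.

```python
from math import comb

# Binomial pmf: probability of k heads in n flips with heads probability p.
def binom_pmf(k, n, p):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Probability: parameter fixed at p = 0.5, data varies.
# Summing over all possible outcomes k = 0..10 gives 1.
total = sum(binom_pmf(k, 10, 0.5) for k in range(11))
print(total)  # close to 1.0

# Likelihood: data fixed at k = 7 heads, parameter varies.
# These values form a function of p; they do not sum to 1.
lik = [binom_pmf(7, 10, i / 10) for i in range(1, 10)]
print(sum(lik))  # noticeably less than 1
```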