Maximum Likelihood Estimation (MLE)
Overview
Definition
Maximum Likelihood Estimation (MLE) is a method of estimating the parameters of a probability distribution by maximizing a likelihood function, so that under the assumed statistical model, the observed data is most probable.
1. Concept
Distinction:
- Probability: Given fixed parameters $\theta$, what is the probability of observing data $x$? ($P(x \mid \theta)$)
- Likelihood: Given observed data $x$, how likely is it that the parameters are $\theta$? ($L(\theta \mid x) = P(x \mid \theta)$)

MLE seeks to find $\hat{\theta} = \arg\max_{\theta} L(\theta \mid x)$.
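As a minimal sketch of "pick the $\theta$ that makes the observed data most probable", the following evaluates the Bernoulli likelihood on a grid. The data (10 coin flips, 7 heads) is made up for illustration:

```python
import numpy as np

# Hypothetical data: 10 coin flips, 7 heads (1 = heads).
data = np.array([1, 1, 1, 0, 1, 0, 1, 1, 0, 1])

def likelihood(theta, x):
    """L(theta | x) = P(x | theta) for i.i.d. Bernoulli(theta) flips."""
    return np.prod(theta ** x * (1 - theta) ** (1 - x))

# Evaluate the likelihood on a grid of candidate parameter values.
thetas = np.linspace(0.01, 0.99, 99)
L = np.array([likelihood(t, data) for t in thetas])

# The maximizer is the sample proportion of heads, 7/10.
theta_hat = thetas[np.argmax(L)]
```

The grid search is only for intuition; in practice the maximizer is found analytically or with a numerical optimizer, as in the procedure below.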
2. Procedure
- Define the Likelihood Function: $L(\theta) = \prod_{i=1}^{n} f(x_i; \theta)$ (assuming independence).
- Log-Likelihood: $\ell(\theta) = \log L(\theta) = \sum_{i=1}^{n} \log f(x_i; \theta)$. It is computationally easier to maximize the sum of logs than the product of probabilities.
- Differentiate: Take the derivative with respect to $\theta$ and set it to zero.
- Solve: Find $\hat{\theta}$.
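The steps above can be sketched numerically. This example assumes i.i.d. Exponential($\lambda$) data (a made-up sample) and compares a numerical maximizer of the log-likelihood with the closed form $\hat{\lambda} = 1/\bar{x}$ obtained by setting the derivative to zero:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical sample: Exponential data with scale 2, i.e. true rate 0.5.
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1000)

def neg_log_likelihood(lam):
    # l(lambda) = n*log(lambda) - lambda*sum(x) for i.i.d. Exponential(lambda);
    # we minimize the negative, which is equivalent to maximizing l.
    return -(len(x) * np.log(lam) - lam * x.sum())

# Numerical maximization of the log-likelihood.
res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 10.0), method="bounded")

# Analytical solution: d/d(lambda) [n*log(lambda) - lambda*sum(x)] = 0
# gives lambda_hat = n / sum(x) = 1 / mean(x).
lambda_closed_form = 1 / x.mean()
```

Both routes agree; the analytical derivation is preferred when the derivative equation has a closed-form solution.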
3. Properties of MLE
- Consistent: Converges to the true parameter value as $n \to \infty$.
- Efficient: Achieves the lowest possible variance (the Cramér-Rao lower bound) asymptotically.
- Invariant: Functional invariance (the MLE of $g(\theta)$ is $g(\hat{\theta})$).
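The invariance property can be checked directly. This sketch (assuming Normal data with known mean 0, made-up sample) computes the MLE of the variance $\sigma^2$ in closed form, then confirms that maximizing the likelihood over $\sigma$ directly lands on $\sqrt{\hat{\sigma}^2}$:

```python
import numpy as np

# Hypothetical data: Normal with known mean 0 and true sigma = 3.
rng = np.random.default_rng(1)
x = rng.normal(loc=0.0, scale=3.0, size=500)

# Closed-form MLE of the variance (known mean 0): average squared deviation.
var_hat = np.mean(x ** 2)

def log_likelihood(sigma):
    # Gaussian log-likelihood as a function of sigma, mean fixed at 0.
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2))

# Maximize over sigma directly on a fine grid...
sigmas = np.linspace(1.0, 6.0, 5001)
sigma_direct = sigmas[np.argmax([log_likelihood(s) for s in sigmas])]

# ...and compare with invariance: MLE of g(sigma^2) = sqrt(sigma^2).
sigma_hat = np.sqrt(var_hat)
```

`sigma_direct` and `sigma_hat` agree up to the grid resolution, illustrating that re-parameterizing does not require re-running the optimization.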
4. Related Concepts
- Binary Logistic Regression - Fits its coefficients by MLE (no closed form; solved iteratively).
- Simple Linear Regression - OLS is equivalent to MLE under normally distributed errors.
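The OLS/MLE equivalence can be verified numerically. This sketch (made-up data, assuming $y = ax + b + \varepsilon$ with Gaussian noise) fits a line by least squares and by directly maximizing the Gaussian log-likelihood, and the coefficients coincide:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data: y = 1.5*x + 4 plus Gaussian noise.
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=200)
y = 1.5 * x + 4.0 + rng.normal(scale=2.0, size=200)

# OLS via least squares.
A = np.column_stack([x, np.ones_like(x)])
ols_slope, ols_intercept = np.linalg.lstsq(A, y, rcond=None)[0]

def neg_log_likelihood(params):
    # Gaussian model y_i ~ Normal(a*x_i + b, sigma^2); maximizing this
    # log-likelihood over (a, b) minimizes the sum of squared residuals.
    a, b, log_sigma = params
    sigma = np.exp(log_sigma)  # parameterize by log-sigma to keep sigma > 0
    resid = y - (a * x + b)
    return np.sum(0.5 * np.log(2 * np.pi * sigma**2) + resid**2 / (2 * sigma**2))

mle_slope, mle_intercept, _ = minimize(neg_log_likelihood, x0=[0.0, 0.0, 0.0]).x
```

Because the log-likelihood depends on $(a, b)$ only through the sum of squared residuals, the two fits return the same slope and intercept; MLE additionally estimates $\sigma$.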