3 The algorithm of estimation: Ronald Fisher’s likelihood principle
This chapter covers
- Ronald Fisher’s On the Mathematical Foundations of Theoretical Statistics (1922) and its role in establishing a rigorous basis for statistical inference
- The development of maximum likelihood estimation (MLE) as a general method for deriving parameter values from observed data
- Fisher’s criteria for evaluating estimators—consistency, efficiency, and sufficiency—and their foundational role in modern inference
- The rejection of prior-based reasoning in favor of inference grounded solely in the observed sample and likelihood function
- The enduring impact of Fisher’s framework in contemporary statistics and machine learning
By the early 20th century, probability and inference had advanced far beyond Bayes' time, yet foundational problems remained unresolved. How could unknown parameters be estimated reliably? What distinguished a sound and repeatable method from an arbitrary one? In his 1922 paper On the Mathematical Foundations of Theoretical Statistics, Ronald A. Fisher introduced a transformative answer: the likelihood principle. He proposed that the most credible parameter values are those that make the observed data most probable under a specified model. Formally, given observed data x and a model with parameter θ, the likelihood function is L(θ) = P(x | θ), and the maximum likelihood estimate is the value of θ that maximizes it. This insight gave rise to maximum likelihood estimation, a method that became one of the central tools of statistical inference and remains so to this day.
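To make the idea concrete, here is a minimal sketch in Python; the coin-flip data and the grid of candidate values are illustrative assumptions, not anything drawn from Fisher's paper. Each flip is modeled as a Bernoulli trial with unknown heads probability p, and we keep the candidate p that assigns the highest log-likelihood to the observed sequence.

```python
import math

# Hypothetical data: 10 coin flips, 7 heads (1) and 3 tails (0)
flips = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]

def log_likelihood(p, data):
    """Log-probability of the observed flips under a Bernoulli(p) model."""
    return sum(math.log(p) if x == 1 else math.log(1 - p) for x in data)

# Scan candidate values of p and keep the one that makes the
# observed data most probable (0.01..0.99 avoids log(0) at the edges)
candidates = [i / 100 for i in range(1, 100)]
mle = max(candidates, key=lambda p: log_likelihood(p, flips))

print(f"Maximum likelihood estimate of p: {mle:.2f}")  # prints 0.70
```

For the Bernoulli model, calculus gives the same answer in closed form: the likelihood is maximized at the sample proportion of heads, here 7/10 = 0.7, which the grid search recovers.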