
Historical Spotlight: Ronald A. Fisher

In 1919, Ronald A. Fisher was appointed chief statistician at the agricultural research station in Rothamsted, a post created for him. His work there culminated in 1925 with the publication of his classic Statistical Methods for Research Workers. An important message of the book was that statisticians needed to be involved at a practical level in scientific experiments of all types. As Donald MacKenzie described Fisher’s view in Statistics in Britain, 1865–1930:

It was not enough that the scientists should hand their results to the statistician for analysis: experiments (especially large-scale experiments that were difficult to ‘control’) had to be designed by those with statistical expertise.

Fisher worked hard to apply both existing statistical theory and his own new theoretical contributions to the experimental work of the field station. He introduced the efficient Latin Square design for agricultural experiments, illustrated analysis of variance with field experiment results, and used regression to analyze the effect of nitrogen fertilizer on grain yield.

Less well known is that, in addition to analyzing data from his own experiments, Fisher also grappled with a “huge bulk” (MacKenzie’s words) of pre-existing data at Rothamsted, the product of experiments dating back to 1852. While not “big data” in today’s terms, it was logistically challenging, and it led to Fisher’s novel use of orthogonal polynomials to fit curves. This data backlog was also the primary reason for Fisher’s appointment (and for the creation of the chief statistician post).

Thus it was that the wheat fields 50 miles north of London produced a bumper crop of statistical theory and applications, codified in Fisher’s seminal book, which remains an icon of statistical science.