London, and his nephew, William Morgan, was the first person
whose professional designation contained the word “actuary.”
Like the calculus, Bayes’ rule was independently discovered
at essentially the same historical moment in two countries.
Pierre Simon Laplace, one of history’s greatest scientists, independently discovered Bayes’ rule, formulated it in more modern
terms than did Bayes, and used it in applications ranging from
estimating the mass of planets to assessing the probability of a
defendant’s guilt given the evidence.
While Laplace is remembered primarily for his contributions to the physical sciences and discovery of the central limit
theorem, he was also an Enlightenment thinker. His probabilistic analysis of court trials was of a piece with his opposition to
capital punishment. McGrayne quotes Laplace’s comment, “The
possibility of atoning for these errors is the strongest argument
of philosophers who have wanted to abolish the death penalty.”
Many decades later, the great mathematician Henri Poincaré used Bayesian inference to help free Alfred Dreyfus, a
French officer falsely accused of spying for the Germans, from
life imprisonment on Devil’s Island. It is ironic that many contemporary lawyers view the Dreyfus Affair as an example of
why probabilistic arguments should be curbed in criminal cases.
Widespread acceptance of Bayes’ updating rule has been a long time coming. From Laplace’s time to the early 20th century, theorists tended to accept Bayes’ rule only grudgingly, for lack of an alternative. A protracted period of ambivalence came to an end when R. A. Fisher published Statistical Methods for Research Workers in 1925. Fisher’s book outlined many of the statistical procedures that have become codified in mainstream statistical education and computing packages: maximum likelihood, analysis of variance, significance tests using p-values, and so on. Perhaps in no small part because of Fisher’s emphasis on recipes admitting of automation, his approach to statistics became hugely influential. Statistical Methods went through seven editions in Fisher’s lifetime, and his methods became the dominant paradigm of statistical inference. Bayesian inference was all but buried, relegated to a generally reviled minority status.
The year 1990, however, ushered in a remarkable reversal of fortune and marked the beginning of a Bayesian revolution that even today continues to gain force. The key insight was that Markov chain Monte Carlo (MCMC)—a simulation technique that emanated from the same group of research scientists that worked on the atomic bomb at Los Alamos—could be used to bypass the intractable integrations needed to compute Bayesian posterior probabilities. By providing working statisticians with a powerful computational tool (if not a collection of Fisher-style recipes), MCMC has enabled Bayesian practice to flourish in a wide and expanding array of domains.
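The way MCMC sidesteps those intractable integrations can be sketched with a minimal Metropolis sampler, the simplest MCMC algorithm. The beta-binomial coin-flip model, step size, and sample counts below are illustrative assumptions, not examples from the review; the essential point is that the acceptance test uses only a ratio of posterior densities, so the normalizing integral never has to be computed.

```python
import math
import random

# Log of the posterior density up to an unknown constant:
# binomial likelihood (7 heads in 10 flips) times a uniform prior.
# (A hypothetical example for illustration.)
def log_unnorm_posterior(p, heads=7, flips=10):
    if not 0.0 < p < 1.0:
        return float("-inf")  # zero density outside (0, 1)
    return heads * math.log(p) + (flips - heads) * math.log(1.0 - p)

def metropolis(n_samples=20000, step=0.1, seed=0):
    rng = random.Random(seed)
    p = 0.5  # start the chain in the interior of (0, 1)
    samples = []
    for _ in range(n_samples):
        proposal = p + rng.uniform(-step, step)
        # Accept with probability min(1, posterior ratio); the
        # unknown normalizing integral cancels in this ratio,
        # which is what lets the sampler bypass it entirely.
        if math.log(rng.random()) < (log_unnorm_posterior(proposal)
                                     - log_unnorm_posterior(p)):
            p = proposal
        samples.append(p)
    return samples

draws = metropolis()
posterior_mean = sum(draws[5000:]) / len(draws[5000:])  # drop burn-in
```

Averaging the retained draws approximates the posterior mean, which for this toy model should land near 8/12 ≈ 0.67; any posterior expectation can be estimated the same way, from the same draws.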
There is more to the story, of course, and McGrayne’s book is
particularly fascinating in the light it sheds on the
diverse ways Bayesian inference has been applied
through the decades by practitioners well outside
the statistical mainstream. Examples include:
■ Bell Telephone scientist Edward Molina used
Bayesian logic to help Bell deal with uncertain
telephone usage given data about call traffic, call
length, and waiting times.
■ Alan Turing, best known for his fundamental
contributions to logic and computer science,
used Bayesian logic to crack the Nazis’ Enigma cipher machine during World War II.
General Dwight Eisenhower based the
timing of the Normandy invasion on an intercepted message from Adolf Hitler that
had been deciphered by Turing’s team.
Eisenhower later estimated that the
work led by Turing had shortened