David Cox (Birmingham, United Kingdom; 1924) studied mathematics at St John’s College, University of Cambridge, and obtained his PhD from the University of Leeds in 1949. At the start of his career, he was employed at the Royal Aircraft Establishment (1944-1946), the Wool Industries Research Association (1946-1950), and the Statistical Laboratory at the University of Cambridge (1950-1955).
He then moved to the University of London, where he was Reader and then Professor of Statistics at Birkbeck College (1956-1966) and Professor of Statistics at Imperial College of Science and Technology (1966-1988), for a time heading its Department of Mathematics (1969-1973). In 1988, he was appointed Warden of Nuffield College, University of Oxford, and joined the University’s Department of Statistics. He continues to work at Oxford despite formally retiring in 1994.
His paper “Regression Models and Life-Tables” (1972) revolutionized the theory and practice of statistical methods for medical research and is the second most cited paper in statistics, with some 30,000 Web of Science citations and 42,000 in Google Scholar. The same article won him the Kettering Prize and Gold Medal for Cancer Research in 1990, the only time this honor has gone to a mathematician. In 2014, Nature placed it 16th in its list of the top 100 scientific papers of all time in any discipline. The influence it has exerted was also recognized in the bestowal on its author of the first International Prize in Statistics in 2016.
Cox holds honorary fellowships and memberships of more than forty universities and learned societies, among them the Royal Society of London, which granted him its Copley Medal in 2010, the British Academy and the U.S. National Academy of Sciences. He also holds the Guy Medal of the UK’s Royal Statistical Society (Silver in 1961 and Gold in 1973), the Weldon Memorial Prize from the University of Oxford (1984) and the Max Planck Research Award (1992), and was President of the International Statistical Institute from 1995 to 1997. He received a knighthood for his achievements in 1985.
“In today's world, we scientists must do our best to propagate a search for the truth”
For mathematician Bradley Efron, statistics is the “supporting actor” of science, but no less important for that. He means by this that statistical methods rarely occupy the spotlight, which is reserved, logically enough, for the big discoveries. But then, discoveries come from data, and data need interpreting… with statistics. Few advances would be possible without the aid of this apparently unglamorous branch of science. And the BBVA Foundation Frontiers of Knowledge Award in Basic Sciences bestowed on Efron, Professor of Statistics at Stanford University (United States), and his colleague David Cox of the University of Oxford (United Kingdom) acknowledges the discipline’s vital contribution to results obtained in multiple domains, from medicine to astrophysics by way of genomics and particle physics.
In today’s world of Big Data, when not only science but also technology and economics feed off vast quantities of data, Cox and Efron’s work takes on even more importance. “The problems confronted by modern science involve working with larger and larger data sets, and that creates noise,” remarks Efron. “Now to get to the important information we need to eliminate a lot of noise, and it is statistics that lets us do that.”
Cox and Efron published their most celebrated contributions in the 1970s. Cox’s paper “Regression Models and Life-Tables” (1972) sets out a powerful tool for estimating how identifiable factors influence the time that elapses before an event occurs. The Cox model can tell us, for instance, about the mortality risk of patients under treatment, the school dropout rate in a given population, or the risk of business bankruptcy. It finds use in cancer research, epidemiology and sociology, in testing the durability of industrial products, and even in predicting the likelihood of earthquakes.
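The model’s core assumption is that each subject’s instantaneous risk of the event (the “hazard”) equals a baseline hazard shared by everyone, multiplied by a factor that depends on that subject’s own characteristics. As a purely illustrative sketch, not Cox’s own code, here is how such a model might be fitted today in Python with the open-source lifelines library and its bundled recidivism dataset:

```python
# Minimal sketch: fitting a Cox proportional hazards model with the
# open-source `lifelines` library (one modern implementation of the
# model Cox published in 1972; library and dataset are illustrative).
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

# Bundled example data: 432 released convicts followed for one year.
# 'week' is the time until re-arrest; 'arrest' marks whether the event
# occurred (1) or the observation was censored (0); the remaining
# columns are covariates such as age and financial aid.
rossi = load_rossi()

cph = CoxPHFitter()
cph.fit(rossi, duration_col="week", event_col="arrest")

# exp(coef) is each factor's hazard ratio: how much it multiplies the
# baseline risk of the event at any given moment.
cph.print_summary()
```

A hazard ratio above one marks a factor that raises the risk of the event at every moment; below one, a factor that lowers it.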
By the time he developed his model, David Cox was already a researcher of repute. Trained at the University of Cambridge, he moved into statistics through the military imperatives of the Second World War. Before earning his PhD, he had been employed at the Royal Aircraft Establishment, and afterwards worked for the Wool Industries Research Association. His subsequent career would take him to the University of Cambridge, the University of London, Imperial College London and, since 1988, the University of Oxford.
The story of the Cox model began with a question put to him independently by several friends studying patient survival times: How can we tell how much the treatment applied influences survival, compared to other features inherent to each patient? “It took me three or four years to develop a solution,” says Cox, “and finally my work got published in an academic journal. Rather to my surprise, lots of people found it useful, which I am very happy about.” In his view, the most interesting applications of his research have to do with organ transplants and the treatment of life-threatening diseases like cystic fibrosis.
Bradley Efron (Minnesota, United States; 1938) has also worked closely with biomedicine – his Stanford professorship is in the Department of Biomedical Data Science at the university’s medical school. His seminal contribution, however, finds application in almost every realm of science. The bootstrap, as it is known, is a statistical technique for determining that most vital of scientific quantities: the margin of error of a given measurement.
Efron, whose interest in statistics was sparked by his father’s fondness for sports rankings, was looking for a way to determine the accuracy of a result without taking repeat measurements – impossible in many cases, such as invasive medical tests. The solution he found was so simple in appearance that it initially met with distrust: “It seemed like cheating and it wasn’t obvious that it would work,” Efron recalls. Not long after, its utility was endorsed in thousands of papers. The crux of his idea was to draw data at random, with replacement, from the only available sample and subject them to analysis. The same process is then re-run many times over, and it is this random re-sampling that yields the margin of error. The technique hinges on the use of computers, which owe their preponderance in statistics to Efron’s work: with them, re-sampling could be repeated time and time again, fine-tuning the precision of outcomes.
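To make the recipe concrete, here is a minimal sketch in Python using NumPy; the sample values are invented, and the mean stands in for whatever statistic is of interest:

```python
# Minimal sketch of Efron's bootstrap: estimate the margin of error of
# a sample mean using nothing but the one available sample.
import numpy as np

rng = np.random.default_rng(seed=42)
sample = np.array([4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.3, 4.4])  # invented data

n_resamples = 10_000
replicates = np.empty(n_resamples)
for i in range(n_resamples):
    # Draw, with replacement, a resample the same size as the original
    # data, then recompute the statistic of interest on it.
    resample = rng.choice(sample, size=sample.size, replace=True)
    replicates[i] = resample.mean()

# The spread of the bootstrap replicates estimates the standard error,
# i.e. the margin of error of the original measurement.
print(f"mean = {sample.mean():.2f} ± {replicates.std(ddof=1):.2f}")
```

The more resamples the computer runs, the more finely the margin of error is pinned down, which is why the method and cheap computing rose together.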
The term “bootstrap” is itself a clue to Efron’s effervescent talent. Searching for a name at least as catchy as that of an earlier statistical tool, the jackknife, Efron took his inspiration from the adventures of Baron Munchausen. In one story, the Baron saves himself from drowning by hoisting himself up by the straps of his boots, an apt image for a technique that relies on the data alone, without external inputs.
Bootstrapping was described in a paper published in 1979. Until then, the margin of error had been worked out by mathematical approximations “which could be very complex and were not always correct,” explains Efron. “With the bootstrap, you delegate the hard part to the machines.”