Roger Griffin
Know your data!
You will not have heard of Roger Griffin. Like Arlo Landolt, he was immensely important not for his own scientific results, but for the science they enabled other people to do. And he was the epitome of the hands-on scientist: he knew each data point, where it came from and how much to trust it. In the age of petabyte data sets, we shall not see his like again.
His main project was determining the orbits of double (or multiple) stars from measurements of their radial velocity, that is, how fast they’re approaching or receding from us. If the stars are in orbit around each other, this velocity varies, and with enough measurements of sufficient accuracy one can figure out an orbit. Under the most favorable conditions the orbits of both stars pop out, along with their masses (this is the only direct way we know the masses of stars); under the least favorable, one is left with limits and good guesses. Since the whole edifice of stellar studies depends on the masses of stars, and the whole edifice of galaxy studies depends on stellar studies, Roger was working on the foundation of most of modern astronomy.
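For readers who want to see where the masses come from, the standard spectroscopic-binary formalism runs as follows (a textbook sketch of the general method, not of Roger’s particular reductions). The radial velocity of one star traces a curve fixed by the orbital elements, and its semi-amplitude gives the so-called mass function:

\[
v_r(t) = \gamma + K_1\bigl[\cos\bigl(\nu(t)+\omega\bigr) + e\cos\omega\bigr],
\qquad
f(m) \equiv \frac{(m_2\sin i)^3}{(m_1+m_2)^2} = \frac{P\,K_1^{3}\,(1-e^2)^{3/2}}{2\pi G}.
\]

Here γ is the systemic velocity, ν the true anomaly, ω the argument of periastron, e the eccentricity, P the period, and i the inclination of the orbit to the plane of the sky. Fitting the measured velocities yields the elements; if both stars’ lines can be measured, the mass ratio follows from the inverse ratio of the two semi-amplitudes (m_1/m_2 = K_2/K_1), and with the inclination known from elsewhere (eclipses, or a visual orbit) the individual masses pop out. A single-lined system yields only the mass function, hence the “limits and good guesses.”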
But it’s not easy. To get a good radial velocity you need a good spectrum, and that means spreading the light of a star out into a much fainter band. That means a big telescope and long observing time. But Roger worked out a technique using cross-correlation, so that he could get good data on even rather faint stars using only the 36-inch telescope at the University of Cambridge, England. Well, coming up with a clever technique and building the instrument is nothing unusual; that’s what much of astronomy is about. The unusual part is that Roger used the same instrument (and a development of it) for a program that lasted, in round terms, half a century. He turned out many hundreds of orbits (and hence masses). No one orbit was likely to shake the foundations of stellar studies, but in mass these results are invaluable.
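Roger’s instrument performed the cross-correlation physically, with a mask in the spectrograph and a photoelectric detector behind it; modern pipelines do the same thing digitally. Purely as an illustration of the idea (a minimal sketch with a made-up single-line spectrum, not anyone’s production code): slide a template spectrum through a grid of trial Doppler shifts, correlate it with the observed spectrum at each shift, and read off the velocity at which the correlation peaks.

```python
import numpy as np

C_KM_S = 299_792.458  # speed of light, km/s

def radial_velocity_ccf(wave, flux, template_flux, v_grid):
    """Estimate a radial velocity by cross-correlating an observed,
    continuum-normalized spectrum with a template over trial velocities.
    All arrays share the same wavelength grid (Angstroms)."""
    f = flux - flux.mean()                      # remove the continuum level
    ccf = np.empty(len(v_grid))
    for k, v in enumerate(v_grid):
        # Doppler-shift the template: a feature at rest wavelength L
        # appears at L * (1 + v/c) for a star receding at velocity v.
        shifted = np.interp(wave, wave * (1.0 + v / C_KM_S), template_flux)
        t = shifted - shifted.mean()
        ccf[k] = np.dot(f, t) / (np.linalg.norm(f) * np.linalg.norm(t))
    return v_grid[np.argmax(ccf)], ccf

# Toy data: one absorption line, "observed" with a true shift of +25 km/s.
wave = np.linspace(5000.0, 5010.0, 2000)
template = 1.0 - 0.5 * np.exp(-0.5 * ((wave - 5005.0) / 0.05) ** 2)
observed = np.interp(wave, wave * (1.0 + 25.0 / C_KM_S), template)

v_grid = np.arange(-100.0, 100.0, 0.5)          # trial velocities, km/s
v_best, _ = radial_velocity_ccf(wave, observed, template, v_grid)
print(f"recovered velocity: {v_best:.1f} km/s")  # ~25.0
```

The payoff of correlating against a template (or, in Roger’s case, a mask) is that all the lines in the spectrum contribute at once, which is why the method works on stars too faint for any single line to be measured well with a modest telescope.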
Each orbit required many data points, and almost every data point was a particular observation, with Roger in the dome of his telescope. Some stars were harder than others. A system with a 20-year orbit required patience. A system with an orbit close to a year also required patience, since it would be visible (away from the Sun) during almost the same part of its orbit each observing season. At any one time he must have had hundreds of active targets. It was convenient that he lived only a five-minute bike ride away from the observatory. It was inconvenient that English weather is often uncooperative. It was also inconvenient that the University, in violation of long-standing covenants, installed floodlights for sports fields near the observatory around the turn of the century. (But a brighter sky is seemingly inevitable everywhere.)
So Roger knew each of the data points he obtained, in a way that just won’t happen again. He also understood statistics, and was not patient with other observers who were unclear or careless in their work. Now, when some data point just doesn’t fit, it is always tempting to exclude it on the assumption that something untoward happened (even when you don’t know what that could have been). Roger was pitiless in this regard, in a way that (we suggest) every scientist should be. We quote from a paper he wrote in 2013:
There are some unusually bad residuals in Table XII; it is not possible to distinguish between stellar and instrumental origins for them, and it seems better to retain them in the solution of the orbit than to reject them for no better reason than that we don’t like them and wish that they were not there.
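To see in numbers why that discipline matters, here is a small Monte Carlo sketch (simulated Gaussian measurements only, nothing to do with any particular star). Iteratively rejecting points that lie more than two standard deviations from the mean, for no better reason than that we don’t like them, shrinks the quoted uncertainty without shrinking the actual scatter of the results: the answer merely looks more precise than it is.

```python
import numpy as np

rng = np.random.default_rng(1)

def clipped_mean(x, nsigma=2.0):
    """Mean and quoted standard error after iteratively rejecting points
    lying more than nsigma sample standard deviations from the mean."""
    x = np.asarray(x, dtype=float)
    while True:
        keep = np.abs(x - x.mean()) < nsigma * x.std(ddof=1)
        if keep.all() or keep.sum() < 3:
            break
        x = x[keep]
    return x.mean(), x.std(ddof=1) / np.sqrt(x.size)

# Many simulated observing programs: 20 honest Gaussian measurements each
# (true value 0, true measurement error 1, arbitrary units).
n_trials, n_points = 20_000, 20
plain_means, plain_errs, clip_means, clip_errs = [], [], [], []
for _ in range(n_trials):
    x = rng.normal(0.0, 1.0, n_points)
    plain_means.append(x.mean())
    plain_errs.append(x.std(ddof=1) / np.sqrt(n_points))
    m, e = clipped_mean(x)
    clip_means.append(m)
    clip_errs.append(e)

print("no rejection    : quoted error %.3f, actual scatter of results %.3f"
      % (np.mean(plain_errs), np.std(plain_means)))
print("2-sigma clipping: quoted error %.3f, actual scatter of results %.3f"
      % (np.mean(clip_errs), np.std(clip_means)))
```

With real outliers of known instrumental origin the calculus changes; but that is exactly the distinction Roger insisted on in the passage above — reject a point because you know what went wrong with it, not because it spoils the fit.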