Graphs and indicators
Digits can be deceiving.
Last week we raised the question of whether the ubiquity of digital displays has caused a decline in people’s ability to read a graph, or any analog presentation. One can see signs of it in students; but then, one can see signs of almost anything in students. Overall, we think it’s probable.
To be clear at the outset: there is nothing bad about digital displays. In fact they have been a godsend to anyone tasked with recording exact data. Many an astronomer in past days struggled to make out what his vernier dial actually showed, and many a shipboard engineer had to work out just what number the needle was pointing to. The engineers were not trying for the astronomers' level of accuracy, but the consequences of misreading a gauge could be much more dangerous. The fact that both tended to be tired, and sometimes stressed, didn't help. A lot of mistakes were made.
The fact that a number was displayed digitally didn’t necessarily make it more exact, but it did make it easier to read and enter in the log, and harder to make mistakes in doing so. And about the time they became common, electronics went through a vast improvement, so the numbers did become better.
But not everyone needs high accuracy, or needs it all the time. Consider a glance at a clock face: it's a bit over half an hour to 11:00, when the guests arrive. There's time to straighten up the kitchen table and put a fresh pot of coffee on, but a full vacuuming of the house is not going to happen. That's harder to conclude from a clock that says 10:23:04; you have to do some mental math to get there. And you can see all the way across the engine room that the main bearing oil pressure is good, because the needle is about centered on the dial.
In fact a digital display can be quite misleading. Your clock says 10:23:04, but it's five minutes fast, so half the digits are nonsense. And from the start, calculators have done the same thing to students (and others). It's quite common for someone to divide, say, 23 by 7.1 and write down 3.239437, despite the fact that the inputs are known to only two digits. In that respect a slide rule was sort of self-correcting: even the best operator could hardly coax more than three digits out of one, which was about as good as any numbers coming from a student lab. Three digits is also about the limit of any simple analog display, which gives an immediate visual impression of what those extra digits actually mean. That in itself, we think, is important.
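The significant-figures point can be sketched in a few lines of Python. This is only an illustration of the arithmetic, not anything from the original post; `round_sig` is a hypothetical helper.

```python
from math import floor, log10

def round_sig(x, figs):
    """Round x to the given number of significant figures.

    Illustrative helper: trims a calculator-style result down to the
    precision its inputs actually justify.
    """
    if x == 0:
        return 0.0
    # Shift the rounding position by the number's order of magnitude.
    return round(x, figs - 1 - floor(log10(abs(x))))

quotient = 23 / 7.1            # calculator display: 3.2394366...
print(round_sig(quotient, 2))  # the two-digit inputs justify only 3.2
```

A slide rule imposes this limit physically; with a calculator, the operator has to apply it by hand.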
[We are impressed by the ingeniousness of analog inventors trying to get around this limitation. Consider a well-calibrated clock with a sweep second hand: with care, you can record six digits, all significant. And a sextant with a vernier scale can read two digits of degrees, two of minutes and tenths of minutes, a total of five figures. These are base-60 digits, but still impressive.]
Well, slide rules are not coming back. And people are going to tell time by digital displays on their smartphones, not by watch dials on their wrists. But maybe we’ve lost something.