In Defense of Cold Hard Numbers

By Skye Hawthorne ’22

It is a fact of life that some of the things our society could not function without – taxes, for instance, or jury duty – are also the most universally hated. And chief among these unjustly badmouthed necessities is an indispensable tool in nearly every field imaginable: statistics. Data, facts, figures: they form the core of every sound argument. Without numbers – verifiable, measurable information – all one would have in any debate would be groundless assertions and, quite likely, ad hominem attacks. Sure, the meaning of a statistic can be debated. Sure, the method by which it was gathered can be called into question. But once a fact has been established, it wields tremendous power, and often that power can be used to do good in the world.

Take, for instance, the issue of phrenology. In the early 19th century, scientists touted the idea that different races of people had different skull shapes because the skull changes shape on account of brain size, and that people of certain racial groups – Black people and Native Americans, in particular – had smaller brains; this belief became known as phrenology. It was, of course, motivated by racism, and was especially popular in America on account of people’s desire to hold on to the horrific and cruel practice of slavery. Yet it was hard to argue with phrenologists, because very little was actually known about the workings of the brain and how it related to skull shape – in particular the “bumps” that phrenologists claimed were correlated with behavior and intelligence.

By the start of the 20th century, however, science had shown that all of this was, how shall I put it delicately, bullshit. Intelligence was shown to be unrelated to skull size or shape (the brain fits the container, not the other way around), and furthermore, no significant differences were found among the skull shapes of people of different races. Granted, phrenology was at one point considered science. But it was bad science, based on flimsy hypotheses and contentious among the scientific community even at the time. And it didn’t fall out of use until other scientists proved it had no merit and, finally, a scientific consensus emerged on the matter. Despite the racism that has continued in our society through the present day, statistics and data have driven phrenology out of the mainstream, to the point that even among other pseudosciences, it wields little to no clout.

This is not the only time that statistical and scientific analysis has brought an end to a racist, sexist, or otherwise unjust practice. And the number of ways that fact-based science can be used for the greater good is staggering. Breakthrough medicines cure formerly fatal diseases every year. Meteorologists can predict deadly storms several weeks out when, sixty years ago, it would have been unthinkable to predict those same storms several days out. Yet all too often, scientists are villainized, and statistics are dismissed on account of being exactly what they are: cold, hard numbers.

Sometimes, this characterization is bold and unabashed, such as the oft-repeated quip attributed to former British Prime Minister Benjamin Disraeli that there are three kinds of lies: “lies, damned lies, and statistics.” But more often than not, this demeaning of science and data happens on a subtler level, and in a way that has seeped through pop culture for generations. Even in the original Star Wars, a movie that is supposedly science fiction, the central message ends up being that in order to destroy the Death Star, Luke needs to trust the Force and his instincts, which are stronger than his targeting computer.

Now don’t get me wrong; I love all the Star Wars movies, with the possible exception of Attack of the Clones (please don’t hate me – it’s just not my cup of tea). I only use it as an example of how pervasive this dismissal of science really is, and how much it seeps into all aspects of culture, even what we might think of as “nerd culture.” And when you consider that we instill in people from such a young age the idea that intuition is more important than reason, it’s no surprise that so much political and scientific misinformation plagues our society. This includes, of course, the issue of fake news, which has caused significant controversy and resulted in several high-profile hearings since the election of President Donald Trump. But perhaps no issue has suffered from it for as long as climate change has.

Anthropogenic climate change – man-made heating of our atmosphere by the emission of greenhouse gases such as CO2, methane, and CFCs – is established fact. Exactly what percentage of the warming we’re responsible for is also pretty much established fact. The answer: most, if not all, of it. In fact, most scientists agree the earth was actually on a slow and steady cooling trend before the industrial revolution, a trend known as the Little Ice Age, which was interrupted by the sudden upsurge in factories, railroads, and fossil fuel mining that took place over the course of the nineteenth century. The concept of heat-trapping gases has been well understood since the late nineteenth century, and scientists were warning of potential anthropogenic climate change by the 1950s. By the 1980s, the science had been all but settled, as had the dizzying prognosis humanity faced if it failed to curb its emissions.

There was just one problem: people didn’t care. Sure, small island countries were already beginning to see concerning sea level rise. But most Americans, when they stepped out onto their front porches, noticed nothing different. Their intuitions told them everything was fine; there was no danger, to them at least, so they were able to ignore what scientists were practically screaming at the top of their lungs: the earth was warming, and it meant big, big trouble down the road for society if people failed to act. Environmentalists, realizing the magnitude of this challenge, did everything they could to sensationalize climate change over the coming decades. They showed photos and films of polar bears struggling to hold on to dwindling icebergs, of rainforests decimated all across the planet, and, of course, of various victims of storms, fires, and other climate-exacerbated disasters, many of them children.

And yet the denial has raged on. Sometimes, it’s subtle. Ted Cruz, for instance, has repeatedly insisted that no global warming has occurred since 1998. This is strategic – 1998 was an anomalously hot year – but it is also blatantly untrue. And yet, due to our collective phobia of statistics, many people would rather just take his word for it than actually look at the data. Other times, denial is much more dramatic, such as when James Inhofe – chair of the Senate Environment and Public Works Committee, as if the situation weren’t comical enough – brought a snowball onto the Senate floor in February 2015 as supposed evidence against anthropogenic climate change.

This denial has been so prevalent because of our innate failure to grasp that a cold day, week, month, or even year in one place is not evidence against climate change. In fact, it is not evidence for or against anything, because it is a single anecdotal observation, not a meaningful dataset. But the actual data is out there for literally anyone to see. You’re a click or two away from direct satellite observations, ice core data, tree ring data, thermal radiation data, land-ocean temperature measurements, and a whole host of scientific papers saying that humans are responsible for these changes, and that the time to act is immediately, if not sooner. You’re only a click or two away from the latest UN report, which declares that there is an “urgent and unprecedented” danger posed by climate change. And so are our nation’s leaders and decision makers. The data is out there. I just wish that data were all it took.
