Don't "alternative facts" and other lies just drive you nuts? How can so much junk be published every day? How can so many people fall for all of this? If we could just show them the facts, they'd see how wrong they are!
Sound familiar? Absolutely!
The funny thing is, this conversation occurs daily not just in progressive, Democratic households … the very same thing happens in conservative, very Republican ones.
We're all convinced that the other side is ill-informed, even stupid.
Unfortunately, the real truth may be even worse: both sides are ill-informed, maybe even stupid. But what I find most distressing is that it applies to science-oriented issues. We all seem to think we have "science" on our side, and the other side has only "junk science" or "politically motivated science".
Sadly, that doesn't appear to be the case at all.
Not only that, but the public as a whole is repeatedly deceived about science by highly motivated parties.
That's bad news … and I've got even more bad news below. Fortunately, I've also got some good news. Some recent research suggests a possible way out of this. The answer isn't that we need more and better dissemination of scientific information. Instead, we need to figure out a way to increase our scientific curiosity, which isn't the same. More about that below.
The good news is that we humans usually are very curious. After all, "inquiring minds want to know"! Which, curiously, may suggest that the way to overcome some of our political divides isn't to provide more "scientific facts". Instead, maybe we just need to tell better stories. Could we all learn something from the "National Enquirer" and TMZ?
But before getting to the really good news, let's take a look at some sad history. A few years ago the term "agnotology" was coined: the study of culturally induced ignorance or doubt, particularly through the publication of misleading or inaccurate scientific data. The classic case is what Big Tobacco did to counteract the scientific evidence, first appearing in the early 1950s, that smoking was dangerous to health. The Big Tobacco companies waged a half-century campaign against those who sought to point out the health dangers of smoking. They were masterfully successful.
Many say that other organizations have been keen students of Big Tobacco. These include Big Oil (trying to call climate change into question); and the National Rifle Association (trying to defend gun ownership). I mention these as examples … there are certainly many more, and on both ends of the political spectrum.
Lots of people get frustrated by these cases. They're often muttering under their breath, "Why can't these people be stopped?" The same people reasonably believe that the prevarications and misrepresentations of any extreme group they happen to disagree with can be exposed with facts, and the battle will be won. Sadly, that's not the case.
Tim Harford, known as the "Undercover Economist", wrote about the problem several months ago in the Financial Times. He observed three problems with what I'll call the "fighting lies with facts" strategy. First, a simple untruth can beat off a complicated set of facts, just by being easier to understand and remember. Sounds terrible, but it really does make sense. The truth behind many things is complicated and hard to remember. Soundbites … even patently false ones … are much easier to comprehend and remember.
Now for what I think is the scary corollary of this. According to Harford, there's evidence that repeating a false claim, even as part of debunking it, can make the false claim stick. PolitiFact, a project that checks the veracity of political pronouncements, may actually be its own worst enemy. PolitiFact's term for an egregious untruth spoken by a politician is "Pants on Fire" … you no doubt know where that phrase came from. Based upon what Harford is saying, every time PolitiFact attempts to shed the light of truth on a "Pants on Fire" comment, it may inadvertently be cementing the idea even more firmly in the mind of the public.
No one understands this concept better than the President of the United States. In fact, some people have made the argument that "the fact checkers are Trump's poodle". Ouch!
The second argument that Harford makes is something all of us learned in school: facts can be boring. Not only that, the facts may be so boring that an awful lot of people just tune out. There's some concern that news organizations slant their news to fit a particular viewpoint. No doubt, there's some truth to that, but even a quick review of newspapers from the 18th, 19th and early 20th centuries shows that this certainly isn't anything new. After a brief perusal of some old newspapers, you could easily come away thinking that today's reporting is actually pretty balanced!
No, the problem with facts could actually be worse. It may be that lots of people simply aren't getting any facts at all. In 2016 researchers Seth Flaxman, Sharad Goel and Justin Rao published a study of how people read news online. The objective was to study the online news reading habits of 1.2 million people and assess bias in news reporting. Unfortunately, their sample of 1.2 million people ended up reduced to about 50,000. The sad truth? Only about 4% of the 1.2 million people read enough serious news to be included in the analysis! You may ask, how much serious news did one have to read in order to qualify for that 50,000? Just 10 news articles and 2 opinion pieces over 90 days … 12 items in roughly 13 weeks, or less than one news story per week! In citing the study, Harford noted: "Many commentators worry that we're segregating ourselves in ideological bubbles, exposed only to the views of those who think the same way we do. There's something in that concern. But for 96 per cent of these web surfers the bubble that mattered wasn't liberal or conservative, it was: 'Don't bother with the news.'" Double ouch!
Harford's third argument is that the truth can be threatening. He observed: "The problem here is that while we like to think of ourselves as rational beings, our rationality didn't just evolve to solve practical problems, such as building an elephant trap, but to navigate social situations. We need to keep others on our side. Practical reasoning is often less about figuring out what's true, and more about staying in the right tribe." Harford cites a classic 1954 study called "They Saw a Game". Researchers from Dartmouth College (my alma mater) and Princeton studied a football game between their respective schools played on November 23, 1951. The reaction to the game was largely colored by one's school loyalty. Needless to say, the researchers found the Dartmouth students overlooked fouls committed by the Dartmouth players and complained about the penalties assessed against their team. Princeton supporters did the exact reverse. Anyone who watches sports in the early 21st century will say, "that's a flash of the blindingly obvious"! Clearly, our tribal affiliations trump our scientific objectivity.
If Harford and other researchers are correct, we really shouldn't be surprised that "determined obfuscators" (my term) like Big Tobacco on smoking, and Big Oil with climate change, are successful in their efforts.
Which is all bad, but it actually gets worse before it gets better. According to Dan Kahan, a professor of law and psychology at Yale University, "groups with opposing values often become more polarized, not less, when exposed to scientifically sound information." Climate change is a perfect example. Liberals seem to believe that if conservatives would pay attention to the scientific facts about global warming, they'd "see the light." Guess what? When scientifically literate conservatives are presented with the facts liberals want them to see, they actually become even more opposed to arguments about global warming! So much for "the facts".
All of which seems to explain a lot of what we observe. The question is, can anything be done about it? Can we somehow not be taken in by companies trying to mislead us? Can we show more interest in scientific matters? Can we overcome our boredom with facts? Can we place scientific objectivity ahead of our tribal loyalties? Can we somehow reduce the polarization, especially about matters of science?
Now for the good news I promised you earlier. The answer is, yes, there may be a way to overcome this. Dan Kahan, the Yale professor cited earlier, wrote an interesting article early in 2017 suggesting a way. Kahan and his fellow researchers concluded that increasing scientific literacy isn't the way to do it: "higher proficiency in science comprehension accentuates identity affirming rather than truth convergent forms of political information processing." I guess that's a professorial way of saying that people who are scientifically literate are just as likely, maybe even more likely, to base their science views on their tribal and political affiliations, not science.
So if scientific literacy isn't the solution, what is? According to Kahan and his associates, the answer has to do with "scientific curiosity." Scientific curiosity isn't the same thing as scientific literacy. In fact, one doesn't have to have a lot of science training to be scientifically curious. Moreover, Kahan also found that people who are scientifically literate aren't necessarily highly scientifically curious.
Kahan and his team created what they call a "scientific curiosity scale." People who are "scientifically curious" are more willing to set aside tribal and political affiliations and be more objective.
Let's assume Kahan and his fellow researchers are correct. If so, then the key may be to increase scientific curiosity. How do you do that? That will likely take more research. The good news, however, is that humans are naturally curious … and there are some things about which we're probably all very curious. You probably won't like the answer, but it seems that we're almost uniformly curious when it comes to sex, scandal, and gossip. Other things, too, but those are the first things that come to mind.
You can't explain "The National Enquirer" or the Kardashians any other way. That doesn't sound like it has anything to do with big science … or any other truly important issues, except it really does … IF you buy into Kahan's argument that the key is to increase curiosity. Somehow, some way, we don't necessarily need to present more scientific facts, we need to find a way to encourage people to be more curious. Harford notes, "We journalists and policy wonks can't force anyone to pay attention to the facts. We have to find a way to make people want to seek them out. Curiosity is the seed from which sensible democratic decisions can grow. It seems to be one of the only cures for politically motivated reasoning but it's also, into the bargain, the cure for a society where most people just don't pay attention to the news because they find it boring or confusing."
So maybe we're all ill-informed and stupid, just not for the reasons for which we accuse each other. And the solution isn't what we think.
If we really want to persuade our opponents to reconsider their science views, the key may be to increase our scientific curiosity. And to learn how that might be done, maybe our first step should be to pull a page out of TMZ's playbook, or learn something about how it's done to us in the supermarket checkout line.