“Things We Know To Be True”: The Death Of Facts In An Age Of “Truthiness”
According to columnist Rex Huppke, there was a recent death you might have missed. The deceased wasn't an actor, musician or famous politician. It was facts.
In a piece for the Chicago Tribune, Huppke says facts – things we know to be true – are now dead.
Huppke says the final blow came on Wednesday, April 18, when Republican Rep. Allen West of Florida declared that about 80 members of the Democratic Party in Congress are members of the Communist Party.
“That was the death-blow for facts,” Huppke tells weekends on All Things Considered host Guy Raz.
One call to the Communist Party USA confirmed that this was, in fact, not true. According to them, no one in the U.S. House of Representatives is a member of the Communist Party. Days later, Allen West stood by his comments.
So that led Huppke to the idea that if someone of any political party can say something so patently untrue and stand by it — which seems to happen more and more often, he says — then facts must be meaningless and dead.
“[Facts are] survived by rumor and innuendo, two brothers, and then a sister, emphatic assertion,” he says. “They’re all grieving right now, but we wish the best for them.”
There’s another sibling that may be too busy thriving to grieve. Comedian Stephen Colbert coined the term “truthiness” as the notion that truth doesn’t lie in books and facts but rather, in your gut. If Huppke is right and facts are indeed dead, perhaps Colbert’s satire is our reality. Where does that leave those of us seeking the truth?
If Facts Are Dead, How About Fact-Checking?
Bill Adair is the editor of PolitiFact, a website run by a team of seasoned journalists that checks claims made by members of Congress, the White House and interest groups. Despite Huppke's obituary, he tells NPR's Raz that the market for fact-checking remains strong.
“Whether the fact has actually died or is just on its death bed, I think it means it’s a great time to be in the fact-checking business,” Adair says, “because there are just so many questions about what’s accurate and what’s not.”
PolitiFact's fact-checking process is long and arduous. The team spends a lot of time researching whether a claim is true, half-true or not at all true, then posts its findings to the site. Even then, the team at PolitiFact — and even some Pulitzer Prize-winning journalists — can't always convince people of what is true.
Adair often gets emails accusing PolitiFact of bias, but he says he's not sure whom the site is supposed to be biased in favor of, since it gets plenty of criticism from both sides.
“I think that’s just the nature of a very rough-and-tumble political discourse,” he says. “We are in a time when there’s more political discourse than ever … and when you hear somebody say your team is wrong, almost like a referee, you’re going to argue with the ref. You’re going to say the ref is biased.”
The ‘Backfire Effect’
Increasingly, people don’t just say the referee is biased, they say the referee is outright lying.
Dartmouth political scientist Brendan Nyhan and his colleague Jason Reifler conducted an experiment in which they had people read a mock news article about President George W. Bush.
The article quoted Bush as saying his tax cuts increased government revenue, which is false. Some of the participants were then given a second article containing a correction: it said the Bush tax cuts actually led to a decline in tax revenue, which is true.
Those who opposed President Bush were more prone to believing the second article, while those who supported Bush, even after reading the second corrected article, were more likely to believe the first.
Nyhan calls this phenomenon the “backfire effect,” and it affects people of all political stripes.
“In journalism, in health [and] in education we tend to take the attitude that more information is better, and so there’s been an assumption that if we put the correct information out there, the facts will prevail,” Nyhan says. “Unfortunately, that’s not always true.”
In some cases, Nyhan says, giving people corrective information about a misconception can make the problem worse, leading them to believe the misconception even more strongly.
While there have been times of less polarization among political elites, Nyhan says there has never been a golden age of factual agreement. People have always believed incorrect things, but what has changed is the way our society is structured.
“That trend toward polarization has exacerbated this divergence in factual perceptions, to the point that it seems like we’ve lost something,” he says.
It’s simply too hard to walk back misconceptions once they’re out in the wild, Nyhan says, whether put there by political elites or another source. If there was a greater reputational price to pay for putting falsehoods out there, he says, perhaps there would be fewer of them in the first place.
“That, to me, is a difficult problem, but certainly an easier one than trying to change human nature,” he says, “which is what you’re talking about when you try to talk about convincing people. It’s just too difficult most of the time.”
By NPR Staff, April 29, 2012