Fake news nearly started a war between Qatar and its neighbors in 2017. In Pakistan, a highly placed official bought into a fake news story warning that Israel was going to destroy Pakistan, and tweeted a warning at Israel that his country, too, was a nuclear power. And in Washington, D.C., an armed vigilante burst into a pizzeria and fired three shots, thinking he was bringing down a sex-slave ring.
While news has never been neutral, something has changed: Information has become weaponized. What's changed, says Washington State University communications professor Doug Hindman, is that the marketplace of ideas has broken down under the pressure of the internet and its algorithm-driven behemoths: Facebook, Twitter, Google, and other ad-selling social media and search platforms.
The marketplace of ideas, Hindman says, was the place where “truth and falsehood grappled,” with truth usually winning. “That’s why we have the First Amendment. We want all ideas to compete.” Now, though, “we’ve got a marketplace that’s flooded with bad, viral information.”
While far from the only symptom of a politically divided culture, the rise of fake news threatens to deepen the rift and, potentially, pull democracy into the crevasse with it. How we get out is unknown, but a new initiative from WSU offers a ray of hope.
Rebecca Donaway, a communications doctoral student, spends much of her time researching social media. She has some insight into how the platforms' algorithmic decision-making is driving this avalanche of fake news.
“What creators of content want to do is create sharable media,” Donaway explains. When we repost something on Facebook, that “share” is algorithmically weighted more than likes or comments. The reason is simple: When we share, we expand the reach of ads that clutch the post’s coattails.
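Donaway's point about share-weighting can be sketched in toy form. The specific weights below are invented for illustration; Facebook's actual ranking model is proprietary and far more complex. The sketch only shows the structural point: when shares count for more than likes or comments, a post that provokes a few shares can outrank one that quietly earns many likes.

```python
# Toy engagement score illustrating share-weighting.
# The weights are assumptions for illustration, not Facebook's real values.

def engagement_score(likes: int, comments: int, shares: int) -> float:
    """Rank a post by weighted interactions, privileging shares."""
    weights = {"like": 1.0, "comment": 2.0, "share": 5.0}  # assumed weights
    return (likes * weights["like"]
            + comments * weights["comment"]
            + shares * weights["share"])

# A post with a handful of shares outranks one with many more likes:
calm_post = engagement_score(likes=100, comments=10, shares=0)     # 120.0
outrage_post = engagement_score(likes=20, comments=15, shares=30)  # 200.0
```

Under this (assumed) weighting, the second post wins the feed despite attracting a fifth of the likes, which is the dynamic Donaway describes: content built to be shared beats content built to be considered.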
Shareable, however, does not mean thoughtful. Thought, in fact, is the enemy of the “sharing” economy. What’s shareable is that which appeals to the reptilian brain that knows only emotion.
The primary emotion that can be quickly—and thoughtlessly—shared, Donaway continues, is outrage. “The things I want to share are these things that I have solid ideas about. ‘This person’s rude; can you believe that jerk?’”
“This is high-emotion, low-information stuff,” Hindman adds.
Donaway nods. “As opposed to some white paper that is breaking down the tax bracket implications for the middle third—something that is really dense and hard for me to create a quippy little post about.”
As criticism of social media algorithms ramps up, Donaway says, “Facebook is just starting to ask, ‘Is this information credible?’ I was a teaching assistant for Doug last semester and in the first week we asked students, ‘Where do you get your news?’ And they told us, Facebook. We said, ‘Well, that’s not a source—what page on Facebook are you going to?’ ‘Well, whatever is in my feed.’
“Credibility is a hard thing to teach. Maybe people should take a three-day detox,” Donaway suggests. “How would you get information if you weren’t on Facebook? Maybe it’s worth finding out. Exercise that information-seeking muscle!”
It’s not just the marketplace of ideas that has broken down, Hindman says, but our faith in institutions. Trust in university research is deeply divided between conservatives and liberals, as is trust in almost every sort of non-partisan expertise or shared institution of governance.
Michael Caulfield, director of networked and blended learning at WSU Vancouver, has thought a lot about how to beat back the tide of fake news. He shares Hindman’s concern, and points to something every authoritarian leader knows. “When people are faced with a flood of misinformation, when they feel that they can’t trust anything, they start to gravitate to what is convenient to believe. And when you remove truth from the equation, all that is left is power,” he says.
“Look at propaganda, which many people think is purely about getting people to believe certain things, but more often than not it’s about getting people to distrust one another.”
Caulfield is working to change the way students are introduced to information literacy. He wants to empower them to put their feelings on hold and discern—with a little fact checking—truth from deception.
“So much of information literacy in the past has been a gotcha: ‘Hey, we’re gonna show you this thing and you’ll debunk it!’ We think we’re teaching them techniques, but the lesson they’re learning is that everything is junk. And once you think everything is junk you become a very easy person to manipulate. There’s no one more easily manipulable than a cynic,” Caulfield says.
“If people feel that there is no way to separate truth from fiction it breeds a sort of cynicism that creates the worst type of government imaginable.”
Instead of cynics, Caulfield is trying to empower students to be “enlightened skeptics.” He calls this approach “the better angels model.” There’s no denying that “there are many parts of us that react in horrible and predictable ways to these click-baity and outrageous stories. But we do have empathy; and we do have the ability to step back and look at things in a way that is disconnected from our own interests.”
Caulfield and his colleagues at other universities are also empowering students by having them write web content. They avoid hot-button issues in favor of less emotional topics that still give students a chance to exercise their information-seeking muscles.
“You can sit around all day and complain about the quality of information” on the web, Caulfield says, but we have a civic responsibility to, so to speak, green the information commons. Students researched and, with expert mentors, peer-reviewed pieces on “niche” topics where there is a lot of low-quality information, such as “are bald men sexier?” and “does music increase IQ?”
The results have been gratifying. Several student-produced pieces have shouldered their way to the first page of Google results.
“If you can imagine this spreading to many institutions, you’re no longer talking about impacting the search results of questions like ‘are bald men sexier,’” says Caulfield. “You’re talking about higher education reclaiming its place of impacting the information environment.”
How to become information literate? Read about the four moves and a habit.
Correction from the print version of the story: In the printed version of this story, the communications doctoral student was misidentified as “Rebecca Calloway,” rather than Rebecca Donaway. The photo and references in the story have been corrected in the version on this page.