The best way to stop disinformation from spreading on social media is to prevent sites from sharing aggregate data, says Wharton’s Eric Clemons. This episode is part of the “Research Roundup” series.
Transcript
The Rise of Disinformation on Social Media
Dan Loney: There is no doubt that we have seen social media platforms deal with growing levels of disinformation in recent years, and on a wide range of topics. The reasons can be political, profit-centered, and many others. But how can these issues of disinformation be dealt with? It's a pleasure to be joined by Eric Clemons, who's a professor of operations, information and decisions here at the Wharton School, and who was part of new research on this topic earlier this year.
Eric, let me start with something that is noted right at the top of the research. You talk about campaigns, and how much of what we see around disinformation is organized as a campaign.
Eric Clemons: One of the things that's interesting is how effective it is. Disinformation about climate change is easy. You tell a bunch of coal miners that it's a Chinese plot to cripple the American coal industry, increase the cost of American production, and increase Chinese welfare, Chinese well-being. You tell someone who lives in a coastal community that, just as you don't trade a player because he missed seven foul shots (that stuff happens), a couple of bad hurricanes in a row doesn't mean you should sell the house or get rid of your SUV. That's easy to say.
The explanation for climate change is so complicated. We all know how a microwave works: the radiation in a microwave makes water vibrate, which is the definition of heat. Radiation from the Earth makes carbon dioxide vibrate, which is also the definition of heat. It's called black-body radiation, and eventually you reach a new equilibrium, which is why we don't melt. I ran an experiment with my class: I told them a couple of climate change lies, which were easy to repeat, and I told them some climate change truths, which were hedged and incredibly complicated. None of them could repeat the truths.
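To see why the honest version is so hard to repeat, consider the simplest textbook sketch of the equilibrium he's describing, a zero-dimensional energy-balance model (an illustrative aside; the formula is standard physics, not something stated in the interview):

$$(1 - \alpha)\,\frac{S_0}{4} = \epsilon\,\sigma\,T^4$$

Here $S_0 \approx 1361\ \mathrm{W/m^2}$ is the incoming solar flux, $\alpha \approx 0.3$ is Earth's albedo, $\sigma$ is the Stefan-Boltzmann constant, and $\epsilon$ is the atmosphere's effective emissivity. Added carbon dioxide lowers $\epsilon$, so the planet must warm to a higher equilibrium temperature $T$ before outgoing black-body radiation again balances absorbed sunlight. Even this toy model needs a paragraph of caveats, which is exactly his point.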
I was stuck on an airplane, and the poor passenger stuck between me and the aisle had just bought a house off the Outer Banks. At the end of two hours, he said, “Let me see if I understand. If I'm very lucky, and every bit of data we have on carbon dioxide is false, and if cloud cover cools the earth, I'm fine. Otherwise, my house is going to be underwater in 10 years.” I said, “Yeah, it could be OK.” But he said, “I'm selling. Why doesn't every American know this?” And I said, “You were stuck for two hours with nothing to do but listen to me. I don't have that opportunity, and I don't have the time to talk to 20 million voters.” The lie is easy and effective; the truth is long, slow, and complicated. That makes lying a very effective way to change opinions.
Loney: When you put it in a campaign form, it becomes much more effective.
Clemons: It does. Now, a lot of the things that are banned in disinformation campaigns are sort of banned in old marketing law. But this is so complicated. It's moved so quickly. Platforms have so many exemptions. And the profit motive to be paid to facilitate a campaign is so high that it's really been almost impossible to control.
Loney: One of the other things you mentioned is that this disinformation, in many cases, is “highly processed,” similar to many of the foods that we eat, correct?
Clemons: I love the analogy. We are finally understanding that highly processed food isn't the same as a balanced diet, and highly processed news isn't the same as an informed electorate. That's exactly right. And I should explain a little bit what “highly processed” means in the context of news. The first part is, “I know what you're inclined to believe. I know what you know.” The amount of information available on every social media user is incredible. Recent studies have shown that social media models can predict the behavior of a user more accurately than their spouse can. We know enough.
If we share that information with a highly motivated liar (let's leave the U.S. out of it and call it a North Korean troll factory), they can write stories that will appeal very closely to individual segments of the market without actually needing to violate any individual's privacy. So, I now know enough to have highly processed lying: very effective, targeted lying. Now, what if I sent that to everybody? If I told you sincerely that Hillary Clinton ran a child pornography ring out of a pizza parlor in Washington, D.C., and had dozens of kids chained to the wall, you would either find it amusing because you know it's false, or you would find it terrifying because you know there are people who would believe it. Highly processed news also goes only to the people who want to believe it.
Why Disinformation Is Hard to Control
Loney: How different is news that people see on a social media platform from what we see on a network newscast or read in a newspaper?
Clemons: That's been changing a little bit since the FCC got rid of its fairness doctrine in the late 1980s. But still, if The New York Times runs a story, it has to be plausible to all of its readers. If readers find it preposterous, the paper loses its credibility, which destroys its value as a brand. In the case of social media, the platform's brand is entertainment. As long as I show stuff that people find entertaining, I'm protecting and enhancing the brand, not weakening it. There's no penalty to Facebook or Twitter for highly focused, highly targeted lying, whereas it would destroy the brand of The New York Times or, for that matter, NPR.
Most broadcast channels attempt to have at least some element of universal plausibility. Even Fox News, which is about as biased as a network can be, attempts a certain amount of reliability on its news programs and news segments. Their editorials are a different story. When I wanted unprocessed news about the Middle East, I actually got it from Al Jazeera, because Al Jazeera is so vulnerable to claims of being false that its news is as reliable as The Financial Times. Their editorials are a different story; I don't spend a lot of time with Al Jazeera editorials. But their news has to be accurate, or they lose whatever credibility they have.
Loney: You make a distinction in the paper that you wrote about this topic between misinformation and disinformation. Please take a moment to explain the difference.
Clemons: Sure. If I actually believed that the election was stolen, and I repeat that, or I actually believe that vaccines cause autism, or that Joe Biden was fully competent going into the debate — if I truly believe that, and I repeat that, I post that, I repost that, that's misinformation. If I tell stuff that I believe for the purpose of sharing my view of truth, again, that's misinformation. But if I know something to be false, I'm actually working in a troll factory, or I'm working for a political campaign, and I design really effective arguments to spread falsehood, that's disinformation. Misinformation is protected by the First Amendment. Disinformation is not.
Loney: Why is disinformation so hard to control?
Clemons: A variety of reasons. One is, one man's disinformation is another man's political campaign. I've got to assume that most people in the Trump White House know that he lost the previous election. And yet they went after voting machine companies for fraud. They're fighting to block mail-in voting because it's effective. I've said this before: This is not a Democrat or a Republican issue. It's a liberal or conservative issue. Liberals, by definition, consider almost everything discussable. I don't mean Antifa. I don't mean radical anarchists. But by definition, a political liberal, in the sense of the late 1700s, believes everything is discussable.
For the truly committed, you're doing God's work. The guy who said, “I disagree with what you say, but I will defend to the death your right to say it,” wasn't part of the Inquisition, right? I mean, he wasn't doing God's work.
Loney: As you bring up, the First Amendment plays a very important role in this process.
Clemons: It does. The First Amendment is a very good thing. The First Amendment is based on the idea that truth wins in the free market for ideas. Similarly, trial by jury isn't perfect. The idea is, both parties do their best. They hire the best attorneys. They don't lie, but they manipulate perception as best they can. And the idea is, in the end, the best idea wins. It doesn't work when technology allows highly processed news. It doesn't work when I can craft a story specifically for you to believe and then make sure you're the only one who sees it.
Loney: Are our traditional news outlets negatively impacted by the mindset around how news is brought forward on social media in this day and age? It's a format that seems really geared for where we are right now.
Clemons: The countries that have the strictest rules on the use of information, nations in the EU, would be the nations that you would expect to be least affected by highly processed news. Four or five years ago, I met with the head of strategy for a traditional Danish broadcaster. She showed that her market share in almost all areas had been reduced by about 75%. And she said her stories have to be true. They have to be the same for everyone. As soon as she starts varying what you see based on what you want to believe, she loses her license to broadcast. She pointed out that social media is much more entertaining. There's been some research that shows that when we watch TV, we want to be entertained, right? And we view social media as entertainment.
There's the work of the Nobel laureate Daniel Kahneman and his collaborator Amos Tversky, the famous behavioral scientists. Kahneman pointed out that we have System 1 and System 2 thinking, fast and slow. Fast is, “Something's wrong. I have to get out of here.” Slow is complex and analytical. Fake news researchers have found that we watch fake news with fast thinking. Is it entertaining? We don't ask, “Is it correct?” It's one of the reasons why labeling demonstrably false news as false and originating in Russia, China, or North Korea has very little impact on people's reposting it, because it's fun. So, fun trumps real media. I don't want to use the word “trump” in that context. Fun beats traditional media on reposting. It beats traditional media on market share. It's just compelling. It wins.
The Best Way to Fight Disinformation on Social Media
Loney: But then I think a lot of people would also say that you have to look at the importance of what social media has become, that it's part of our culture, and that it's going to stay part of our culture. How do you deal with the level of disinformation that we have there, but still allow social media platforms to be fun for people who would like them to be fun?
Clemons: I have an example of a social media website that was so trivial that it was able to police itself, until it was bought by Anheuser-Busch. It was a craft beer rating site. I could ask, “Who's got the profile most like mine?” And it turned out to be a guy named Rat Man. I would ask, “What did Rat Man think of this beer?” And he would say, “Even for me, too bitter.” Well, all right. If he didn't like it, it's certainly too much of an IPA for me.
There was one beer introduced by a brewery, and they obviously told their employees to love it. Suddenly, there were thousands of positive reviews for that beer, all coming from the same location: the location of the brewery. I had them pulled. Because we could police it, the manipulation really didn't matter.
How do you police social media? Well, the first thing you do is prohibit the platforms from sharing aggregate statistics. This is one of the reasons why TikTok's claim to be safe is patently false. TikTok doesn't have to move any data on any individual out of the U.S. It simply needs to categorize individuals into buckets of what they believe, and that data doesn't have to leave the States. China can craft stories and send them to TikTok. TikTok can post them and direct them to the people who want to read them. And none of this violates privacy law.
If you want to limit the effectiveness of highly processed fake news, you take away the processing factory. You don't let Facebook share data, even aggregate data, on people's beliefs. You don't let Facebook post stories to individuals who are responsive to them. That's hard to do, because aggregate data doesn't violate privacy law. And Facebook has successfully argued in front of the EU that sending you a story that you love is part of their commitment to be a good entertainment company. It's not easy to do this. But if you take away the raw material, you shut down the processing factory, then social media becomes just a blackboard.
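To make the mechanism concrete, here is a minimal sketch in Python of the "processing factory" pipeline described above. Every name, bucket label, and story in it is invented for illustration; the point is only that individual profiles never leave the platform, yet person-by-person targeted delivery still works.

```python
from collections import defaultdict

# Step 1 (on-platform): individual profiles, which never leave the platform.
# All user records here are toy data.
users = {
    "u1": {"follows": ["coal_jobs", "anti_epa"]},
    "u2": {"follows": ["coastal_realestate", "hurricane_watch"]},
    "u3": {"follows": ["coal_jobs"]},
}

def bucket_of(profile):
    """Toy classifier: map a profile to a hypothetical belief segment."""
    if "coal_jobs" in profile["follows"]:
        return "climate_skeptic_energy_worker"
    return "anxious_coastal_homeowner"

buckets = defaultdict(list)
for uid, profile in users.items():
    buckets[bucket_of(profile)].append(uid)

# Step 2 (off-platform): only aggregate counts are exported. No individual
# record crosses the border, so no privacy law is violated.
exported_stats = {name: len(members) for name, members in buckets.items()}
print(exported_stats)

# Step 3 (off-platform): a motivated actor writes one story per bucket,
# using only the exported aggregates.
stories = {
    "climate_skeptic_energy_worker": "Climate rules are a plot against miners.",
    "anxious_coastal_homeowner": "Hurricane clusters are random; don't sell.",
}

# Step 4 (on-platform): the platform routes each story back to its bucket,
# reaching individuals the off-platform author never saw.
for segment, story in stories.items():
    for uid in buckets[segment]:
        print(f"deliver to {uid}: {story}")
```

The step a regulator would attack is step 2 together with step 4: remove the export of bucket-level statistics and bucket-addressed posting, and the off-platform author has nothing to aim at, leaving social media as the plain "blackboard" described above.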
Loney: As we're moving forward, disinformation is probably something that will stay around for the foreseeable future.
Clemons: If you remember 1984, the book, it's often said, was originally titled 1948, and the expectation was that technology would enable this degree of societal manipulation very quickly. It didn't, but we're getting there. Social media will be around forever. Lying has been a tool of statecraft and a tool of warcraft forever. Absolutely forever. The difference is between traditional one-size-fits-all lying, which is highly effective only when a society is falling apart, and highly targeted lying, which works most of the time.
So, we've got a society where there is a strong motivation to win. There are people who truly believe they're doing God's work. And they have the tools to manipulate. What we have to do if we want to limit it without limiting free speech is deprive the tools of the data needed to process, create, and direct the news.



