Trump, Meta and misinformation

MICHEL MARTIN, HOST:

Facebook, the world's largest social network, is about to bring one high-profile account back to the ranks of its nearly 3 billion users - that of former President Donald Trump. Then-President Trump was suspended from Facebook and Instagram after the January 6 mob attack on the Capitol for praising the violence that congressional investigators say he helped instigate. But that suspension came with an asterisk. Meta, Facebook's parent company, would reevaluate it in two years. Now that Meta has decided to allow the former president back onto its platforms, it is also rolling out a new policy for those it designates as public figures - defined as government officials, political candidates and people with over 1 million followers. But even with these new rules, many fear Facebook has not made enough changes to tackle the spread of falsehoods.

We wanted to talk more about the possibility of Trump returning to Facebook and the platform's ability to police misinformation. For that, we called Vivian Schiller. She's the executive director of Aspen Digital, a part of the Aspen Institute that's working to empower people to be responsible stewards of technology and media. She's the former global chair of news at Twitter and, we also want to mention, the former president and CEO of NPR. And she's with us once again. Vivian Schiller, welcome back. Thanks so much for joining us.

VIVIAN SCHILLER: Thanks, Michel. Nice to be on the air with you.

MARTIN: In Meta's statement on reinstating Trump, they said in regard to public safety, quote, "our determination is that the risk has sufficiently receded." It goes on from there. But what is the metric for that? I think that's one of the questions. What is the metric for determining something like that? Because one can see where that wouldn't just be relevant to Trump but to other public figures with a big megaphone.

SCHILLER: Right. And that's why it gets very complicated. Regardless of how you feel about Donald Trump and his calls for insurrection, which are, you know, pretty clear, it was two years ago. He is no longer in office. He no longer wields authority in the same way that he did. And so I think, as a major platform, they needed to put in place an opportunity for someone - and I'm using air quotes here - "to redeem themselves," by holding them very closely accountable for the kinds of posts that they put forward later. And the key thing here is they have to apply these rules consistently, and they have to apply these rules religiously, even to Donald Trump. It's almost certain that Trump is going to violate some of those rules pretty quickly. So I think the really interesting part is going to be what happens when he does.

MARTIN: So the new policy for public figures addresses civil unrest. But what about misinformation and disinformation that doesn't fall under that category? And I want to point out here that disinformation and misinformation can be prevalent on Facebook. And when the former president was on the platform, for example, he posted numerous falsehoods, ranging from misleading information about COVID-19 to, of course, again, falsely claiming that the 2020 election was stolen. So are there policies to address that?

SCHILLER: Yeah. So, I mean, first of all, it's important to distinguish between mis- and disinformation in general and mis- and disinformation that could arguably lead to immediate public harm. I mean, it is not against the law to be wrong on the internet, and that includes for the president of the United States. He can lie all he wants. We may find it heinous, but I would argue he should not be deplatformed for putting incorrect information out there. Where that changes is if there is a direct and immediate threat, such as, you know, let's go storm the Capitol, please join me on January 6 - or mis- and disinformation around the pandemic that could lead to, you know, terrible negative health repercussions for people.

So it's very, very tricky. They have put out quite explicit rules about what kind of content is not permissible. And they've even designated that some content they find objectionable will be down-ranked so it's not easily seen, or that he can't use it for fundraising - can't make money off of it. Other kinds of content would cause him to be suspended again, either for another period of, you know, one or two years or potentially permanently.

MARTIN: I'm going to go back to something you said earlier about how it's not against the law to be wrong on Facebook or any other platform, for that matter. In 2021, a study that analyzed user behavior on Facebook around the 2020 election found that publishers who trafficked in misinformation got six times as many likes, shares and interactions as traditional news sources. Why is that?

SCHILLER: Because those who would seek to sow chaos are also very good at pushing our buttons as human beings. I don't know how we solve for that. Content that makes us angry, content that makes us feel like it justifies our worldview, anything that is provocative or outlandish - that is always going to attract more attention than sober facts. And on top of that, the platforms, such as Facebook - their entire business model is about amplifying that content to get us to stick around longer. So this is a systemic issue that goes well beyond, you know, Donald Trump on Facebook and any Facebook posts he may or may not post. It's a much broader issue, and I and many others are trying to figure out what a better civic information ecosystem would look like, because this ain't it.

MARTIN: With another presidential election just about upon us - we're at the midway point, but President Trump has already declared his candidacy, and other people are obviously making their decisions right now - you've already said that this isn't something that can be easily corrected for. What should we be thinking about here?

SCHILLER: Really, the best way forward is to design new alternatives. And there's a lot of work being done in this space to design new alternatives to online information systems and also, critically, to restore and improve news, particularly at the local level, in this country. There is a massive news deficit - local news has been decimated. And into that vacuum flow falsehoods and Facebook groups and Donald Trump's posts and all of that. So that's where we need to focus our attention.

MARTIN: That's Vivian Schiller. She's executive director of Aspen Digital. That's a part of the Aspen Institute. Vivian Schiller, thanks so much for talking to us once again.

SCHILLER: Glad to be with you.