We are allowing social media platforms to self-regulate: Disinformation watchdog

Nina Jankowicz, Wilson Center Disinformation Fellow, joins Yahoo Finance's On The Move to discuss how disinformation is impacting Big Tech's policies and the U.S. election.

Video Transcript

JULIE HYMAN: Welcome back. Twitter has changed its hacked materials policy. That's in the wake of the controversy over it and Facebook, for that matter, blocking links to a "New York Post" report that cited what were allegedly Hunter Biden's emails. But there are a lot of questions about that report.

Twitter says now it will no longer remove hacked content, unless directly shared by hackers. But this illustrates some of the challenges that the social media networks have when it comes to politically charged stories, especially those that are suspected of not being true.

Nina Jankowicz is joining us now. She has quite a lot of experience in this area as well. She's the disinformation fellow at the Wilson Center. She's joining us from Virginia. So Nina, when you look at what unfolded over the past few days, I mean, how do social networks get this right when they are trying to combat what they see as disinformation?

NINA JANKOWICZ: Yeah, it's a really difficult area, Julie. I think, you know, in this case, we've seen some-- basically a circumspect attitude from the media, not wanting to repeat the mistakes of 2016 when the hack and leak by the Russians of the DNC, and later, the Clinton campaign, really had a huge effect on the discourse surrounding the 2016 election.

And here we have materials of unknown provenance being shared, including the personal details of some of those involved, like Hunter Biden. And I think that's what inspired Twitter to take action here. They do have a policy that says you're not allowed to share personal information in hacked materials.

And generally, their policy was a little bit broader before. They said, you know, you're not allowed to share hacked materials at all. But reporters were allowed to report on it.

So it gets a little squiggly, basically. It's difficult. It's a difficult scenario. And without regulation, in the absence of any government action on this, we're letting the platforms self-regulate. And I think we're seeing them make some missteps here.

DAN ROBERTS: Yeah, Nina, Dan Roberts here. Let's stick to that idea that the platforms are self-regulating. I mean, regardless of the veracity of the "Post" story on Hunter Biden, this act by Twitter, you know, temporarily blocking anyone from even tweeting out the link, it was, to me, an interesting step.

Because the whole conversation, for years already, you know, before just the sort of fraught Trump era, has been: are these websites just open platforms, and they can say, we're not an arbiter of content, we don't do that, we're just a platform? Or now, are they becoming more media companies, media arbiters?

I mean, this, to me, regardless of your politics, in many ways, was Twitter deciding, OK, we are going to fact check. We are going to pick and choose what we allow. And once you get into that, I mean, it kind of changes the DNA of these whole companies.

NINA JANKOWICZ: Right, and that's one of the big misconceptions, I think, about how the platforms have been operating this entire time. Your Facebook news feed is not just a chronological news feed of the posts that you are subscribed to. Your Twitter feed also is weighted based on what they think is most interesting, what you're going to interact with.

So the platforms have been making those decisions, those editorial decisions, for a very long time. And the fact that this is playing out in the political arena is just kind of blowing that fact up to people. But that's been happening for a long time.

And the fact is, these are private platforms. They're allowed to make those decisions, because you sign up to their terms of service and sign up to those decisions that they're going to be making. So for people to cry, you know, First Amendment foul here is not exactly right. Because your First Amendment rights don't exist on private platforms.

ADAM SHAPIRO: But they function somewhat like newspapers, even though they're not. So why not just repeal 230, Section 230, and make them accountable? Perhaps let Hunter Biden sue them, if his videos were, indeed, hacked.

I don't believe he's commented. No matter how weak the "New York Post" story may be, he hasn't commented on this yet. So why not make them adhere to the same rules that a true journalistic outlet would have to adhere to?

NINA JANKOWICZ: Well, there's a problem of scale, first of all, just the fact that there are billions of posts occurring on these platforms every day. Most legal analysts-- and to be clear, I'm not a lawyer-- say repealing 230 would actually be a detriment to freedom of speech. It would allow the platforms to take action at a massive scale against all sorts of posts, just to kind of save their own butts. And that's not what we want.

We've seen a similar situation in Germany, where platforms have to remove content that runs afoul of German hate speech laws within 24 hours. And the result has been the platforms over-removing content. That would be a much more suppressive outcome for freedom of speech than what's going on now.

So I think we perhaps need to reexamine Section 230 in light of what's been going on. But I think repealing it wholesale would be a problem.

RICK NEWMAN: Hey, Nina, Rick Newman here. Even if Twitter and Facebook are stumbling along, they're definitely doing a lot more than they did in 2016, which was basically nothing. So how much more should they be doing? And what is the solution here?

NINA JANKOWICZ: Well, there is no one solution. I know that's a bit of a non-answer, but we are missing a regulatory landscape entirely. Congress has abdicated its duty to even think about these issues because they have become so politicized.

The platforms are making perhaps not a fully good-faith effort. A lot of this is about PR wins for them, saying they're taking down content from foreign interference when, you know, we know that there is more out there. But they are making an effort. They're cooperating with law enforcement.

I think one area that we really have not looked at enough is the fissures in our society that make us so vulnerable to disinformation in the first place. So why are we so able to be manipulated? Is it our media literacy skills? Yes, people really do need better tools for how to navigate this information environment with so much coming at them.

But it's also the issues that allow foreign interference to exist: endemic racism, economic inequality, the hot-button issues that have become the hallmarks of every political campaign. We have to solve those.

Because even though it might be Russia today, or China, or Iran, we've got domestic disinformers and all sorts of foreign adversaries who are going to try to copy those tactics in the years to come. And until we address those internal problems, we're going to remain vulnerable to all sorts of interference and disinformation.

JULIE HYMAN: This problem is definitely not going away. Nina, thank you so much for your perspective. Nina Jankowicz is the Wilson Center's Disinformation Fellow. Really appreciate your time today.