“The 360” shows you diverse perspectives on the day’s top stories and debates.
Facebook and Twitter last week each took steps to limit the spread of a questionably sourced New York Post story that allegedly contained content from a computer belonging to Joe Biden’s son Hunter.
Twitter temporarily prevented users from tweeting links to the story because, the company said, it violated its policies against sharing hacked materials and personal information like email addresses. Facebook suppressed distribution of the story while it conducted an independent fact check.
The moves sparked outcry from a number of top Republicans, including the president, who accused the social media firms of meddling in the election by stifling news that could prove harmful to Biden. The incident also reignited calls from some GOP lawmakers to roll back or repeal Section 230, a portion of the Communications Decency Act that gives companies like Twitter and Facebook the right to moderate content on their platforms and frees them from legal liability for what their users post.
The incident is the latest disagreement in an ongoing tug-of-war between conservatives and Big Tech. In recent months, Facebook and Twitter have taken increasingly aggressive steps to combat misinformation, including taking action against posts by Trump that contained misleading information about the coronavirus and mail-in ballots. These new policies regularly spark accusations of anti-conservative bias. Many Democrats, on the other hand, say social media firms give Trump and other GOP lawmakers far too much leeway to post misinformation.
Why there’s debate
Under current law, Facebook and Twitter have the legal right to edit, modify or even delete any content posted to their platforms. Whether they should do that is a matter of heated debate, and the New York Post story presented a particularly thorny challenge, experts say. On one hand, it came from a news organization that presumably has higher standards of proof than the average user. On the other, the information in the story is dubious at best and came out of an editorial process that reportedly raised doubts among the Post’s own staff.
Those who support the decision to suppress the story say major social media firms have an obligation to stamp out misinformation that could sway the presidential election, something they have been accused of failing to do in 2016. Steps like suppressing the Post’s story, along with efforts to root out conspiracy theories like QAnon, are long overdue, they argue.
Conservative critics say tech companies selectively enforce their moderation policies, often singling out right-leaning users while giving similar behavior by Democrats a pass. The public should have the opportunity to judge news stories on the merits rather than have social media companies decide what they should see, they argue.
Others are sympathetic to the need to fight misinformation, but don’t believe that the Big Tech firms — which often use opaque and constantly changing moderation standards — are well equipped to handle that task. Another group fears that too much government intervention in social media company policies risks putting lawmakers in the position of policing free speech.
The CEOs of Facebook, Twitter and Google’s parent company, Alphabet, are scheduled to attend a hearing next week of the Republican-led Senate Commerce Committee on possible changes to Section 230. Members of the Senate Judiciary Committee have said they are considering a similar hearing in the near future.
Facebook and Twitter have an obligation to prevent election interference on their platforms
“It’s clear that what Facebook and Twitter were actually trying to prevent was not free expression, but a bad actor using their services as a conduit for a damaging cyberattack or misinformation.” — Kevin Roose, New York Times
Political pressure shouldn’t affect how social media companies apply their policies
“Platforms should have rules against distributing hacked material and misinformation, and there’s no good argument for enforcing or not enforcing them according to how much it might affect a campaign. That’s a judgment call the platforms are completely unqualified to make.” — Jon Healey, Los Angeles Times
Complaints about anti-conservative bias are unfounded
“Conservatives do sometimes see their posts or tweets labeled as suspect or taken down. … But this illustrates major platforms actually doing their job of weeding out disinformation bearing on a public health crisis. There is no evidence to speak of that shows a systemic campaign to squelch conservative views online.” — Paul M. Barrett, The Hill
Social media companies shouldn’t repeat their mistakes from 2016
“It would be the height of irresponsibility to simply broadcast these dubious allegations without first verifying their veracity and provenance. That’s what happened in 2016 when social media sites such as Facebook and Twitter became conduits for a massive Russian attack on the U.S. election. … It is entirely understandable and proper that Facebook and Twitter should exercise some caution this time around.” — Max Boot, Washington Post
They should be commended for doing the right thing when they knew it would cause a backlash
“It’s not going to get easier for Facebook or Twitter. They should expect more incidents as we approach the November election. But the two companies are showing they are willing to make tough, real-time decisions to protect our democracy and the electoral process. This is progress.” — Tae Kim, Bloomberg
Overaggressive censorship is a threat to free speech
“We stand on dangerous ground when we allow governments to intervene to ‘protect’ us from bad and dangerous words and thought. The only thing worse would be to encourage social media companies to do the same.” — Jack Shafer, Politico
Social media companies have an obligation to be neutral
“What Silicon Valley has failed to grasp in the past few years is that it is better to be neutral in the political space than to favor one side over the other. By favoring left-leaning individuals and causes, Big Tech has created their own problems.” — Adam Brandon, Washington Examiner
Big Tech firms shouldn’t let politics guide their decisions
“For years, it has been obvious that social-media companies simply react and respond to the moral panics happening at other media companies. They are terrified of being blamed or, in Facebook’s case, blamed again for Donald Trump.” — Michael Brendan Dougherty, National Review
News organizations should get more leeway than regular users
“The platforms are right to be on guard for foreign election meddling. … But when a real American news outlet, albeit less than perfect, publishes a questionable story, the platforms are far better off flagging it as suspect rather than sending it down a memory hole. This isn’t government censorship, but it’s censorship, and it only feeds suspicion, corrosive distrust and legitimate complaints about double standards.” — Editorial, New York Daily News
Tech firms aren’t equipped to act as fact checkers
“Traditionally, reporting mistakes or bad sources are exposed by other journalists, subject matter experts, or sources with firsthand knowledge. Social media companies have none of these prerequisites, and moderation doesn’t offer new information to help readers make up their minds, it just suppresses the original story. This fundamentally short-circuits the normal journalistic process.” — Adi Robertson, Verge
If they can’t be more transparent, social media companies shouldn’t censor news content
“Transparency is essential to the fact-checking community and to the cause of reducing mis- and disinformation. The decision to reduce or prevent the distribution of the New York Post’s article based on some mysterious, non-transparent criteria and an unknown methodology is a serious mistake.” — Cristina Tardáguila, Poynter
Is there a topic you’d like to see covered in “The 360”? Send your suggestions to firstname.lastname@example.org.
Photo illustration: Yahoo News; photos: Getty Images