In an explosive hearing in July, an unidentified former Twitter employee testified to the House Jan. 6 committee that the company had tolerated false and rule-breaking tweets from Donald Trump for years because executives knew their service was his "favorite and most-used . . . and enjoyed having that sort of power."
Now, in an exclusive interview with The Washington Post, the whistleblower, Anika Collier Navaroli, reveals the terror she felt about coming forward and how eventually that fear was overcome by her worry that extremism and political disinformation on social media pose an "imminent threat not just to American democracy, but to the societal fabric of our planet."
"I realize that by being who I am and doing what I'm doing, I'm opening myself and my family to extreme risk," Navaroli said. "It's terrifying. This has been one of the most isolating times of my life."
"I wouldn't be doing this if I didn't believe the truth matters," she said.
Twitter banned Trump two days after the Jan. 6, 2021, attack on the U.S. Capitol, citing fears he could incite further violence. By that time, he had sent more than 56,000 tweets over 12 years, many of which included lies and baseless accusations about election fraud. One month earlier, he had tweeted, "Big protest in D.C. on January 6th. Be there, will be wild!"
Navaroli, a former policy official on the team that designed Twitter's content-moderation rules, testified to the committee that the ban came only after her months of calls for stronger action against Trump's account had been rebuffed. Only after the Capitol riot, which left five dead and hundreds injured, did Twitter move to close his account, which had 88 million followers.
Tech companies traditionally require employees to sign broad nondisclosure agreements that restrict them from speaking about their work. Navaroli was not able to speak in detail about her time at Twitter, said her attorney, Alexis Ronickher, with the Washington law firm Katz Banks Kumin, who joined in the interview.
But Navaroli told The Post that she has sat for multiple interviews with congressional investigators to candidly discuss the company's actions. A comprehensive report that could include full transcripts of her revelations is expected to be released this year.
"There's a lot still left to say," she said.
Navaroli is the most prominent Twitter insider known to have challenged the tech giant's conduct toward Trump in the years before the Capitol riot. Now in her 30s and living in California, she worries that speaking up about her role in pushing for Trump's removal could lead to threats or real-world harm.
A committee member, Rep. Jamie B. Raskin (D-Md.), cited those concerns to explain why Navaroli's voice had been distorted to protect her identity in the segment of her testimony played during a nationally televised hearing in July. Raskin revealed her name in a tweet Thursday, thanking her for her "courageous testimony" and "for answering the call of the Committee and your country."
"She has constantly had to say to herself: This is important for the world to know, but it can compromise my safety. And she continually makes the patriotic choice," Ronickher said. "The folks who do come forward and are willing to take these risks make such an impact for the rest of us."
The hearings, which have been watched by millions, are expected to resume next week. The committee's chairman, Rep. Bennie G. Thompson (D-Miss.), said Tuesday that the hearing could feature "significant witness testimony that we haven't used in other hearings."
Twitter for years dismissed calls to suspend Trump's account over posts that many people argued broke its rules against deceptive claims and harassment; executives contended that, because Trump was a political leader, his tweets were too newsworthy to remove.
But if Trump had been "any other user on Twitter," Navaroli told the committee, "he would have been permanently suspended a very long time ago."
The ban has helped fuel a conflict over tech companies' rules that is likely to be settled in the Supreme Court. More than 100 bills have been proposed in state legislatures that would regulate social media platforms' content moderation policies, and on Wednesday, Florida asked the Supreme Court to determine whether the First Amendment prevents states from doing so.
Twitter executives have argued that Navaroli's testimony leaves out the "unprecedented steps" the company took to respond to threats during the 2020 election. The company said it worked to limit the reach of violent extremist groups and ban accounts from organizers of the Capitol riots.
The company is "clear-eyed about our role in the broader information ecosystem," Jessica Herrera-Flanigan, Twitter's vice president of public policy for the Americas, said in a statement in July.
In the interview with The Post, Navaroli, who is Black, said she still remembers the first time she thought about the constant conflict between Americans' rights of safety and free expression. She was a middle-school student, walking with her mother to a Publix grocery store near their home in Florida, when a man swerved his truck onto the sidewalk toward them, shouting racial slurs and demanding they go back to where they came from.
After the police arrived, she said, the officers refused to file charges, saying that no one had been hit and that his speech had been protected by the First Amendment.
"It was the first time I was understanding my identity could cause someone to . . . try to murder me," Navaroli said. "And I was being told this man that tried to kill me did nothing wrong because this was his constitutional right. It didn't make sense. So for a lot of my career and a lot of my life, I have been trying to understand this interpretation of this amendment and this right in a way that makes sense."
In high school, she said, she became fascinated by constitutional questions in her debate class, which staged mock congressional hearings, one of which took her, for the first time, to Washington, where years later she would sit and give congressional testimony.
In the years afterward, she graduated from the University of North Carolina's law school and got her master's degree at Columbia University, where in 2013 she wrote a thesis titled "The Revolution will be Tweeted" on how constitutional legal principles had expanded to social media.
She later helped study issues of race and fairness with a technology research group in New York, worked on media and internet privacy campaigns for the civil rights advocacy group Color of Change, and taught basic principles of constitutional law to high school students in Harlem.
As the power and prominence of social media expanded during those years, she said she grew fascinated with how online content moderation rules were helping shape real-world social movements, from the inequality campaigns of Occupy Wall Street to the protests over racial justice and police brutality.
She had a strong bias for protecting speech, she said, but she often questioned where some companies were drawing the lines around speech and privacy and what effect that could have on people's lives.
"Regulating speech is hard, and we need to come in with more nuanced ideas and proposals. There's got to be a balance of free expression and safety," she said. "But we also have to ask: Whose speech are we protecting at the expense of whose safety? And whose safety are we protecting at the expense of whose speech?"
By 2020, Navaroli was working on a Twitter policy team helping the company design rules for one of the internet's most prominent gathering places for news and political debate, according to congressional testimony revealed this summer.
By then, Trump had become Twitter's inescapable force, capturing global attention and news cycles with a constant stream of self-congratulatory boasts and angry tirades.
Starting in 2011, he used the site as a major propellant for the racist "birther" claim that former president Barack Obama was born in Kenya. In one 2014 tweet, Trump asked cybercriminals to "please hack Obama's college records (destroyed?) and check 'place of birth.' "
During the 2016 campaign, his jotted-off insults helped undermine his critics and sink his political rivals as he captured the Republican nomination and then the presidency. And once in the White House, his tweets became a constant source of surprise and anxiety for even his own administration.
He used Twitter to fire people and belittle America's geopolitical antagonists, including tweeting in 2018 to North Korean leader Kim Jong Un that "I too have a Nuclear Button." He also used it to announce sweeping executive actions, including his (failed) push to ban transgender people from the military. "Major policy announcements should not be made via Twitter," Sen. John McCain (R-Ariz.) said then.
Navaroli had argued that Twitter was too reluctant to hold Trump to the same rules as everyone else, and by 2020 she had begun to worry that the company's failure to act could lead to violent ends, she told congressional investigators.
After Trump told the Proud Boys, a far-right group with a history of violence, at a September 2020 presidential debate to "stand back and stand by," Navaroli pushed for the company to adopt a stricter policy on incitement.
Trump "was speaking directly to extremist organizations and giving them directives," she told the committee. "We had not seen that sort of direct communication before, and that concerned me."
She had also seen how his tweets were quickly sparking replies from other accounts calling for "civil war." After Trump's "will be wild" tweet in December, she said, "it became clear not only were these individuals ready and willing, but the leader of their cause was asking them to join him in . . . fighting for this cause in D.C. on January 6th."
The company, however, declined to take action, she told the committee. She pleaded with managers, she said, to face the "reality that . . . if we made no intervention into what I saw occurring, people were going to die."
On Jan. 5, 2021, as pro-Trump forums lit up with excitement about the coming day, she said she was deeply unnerved by the company's failure to take stronger action against messages from "a violent crowd that was locked and loaded," she told congressional investigators. She said she wrote that night in an internal Slack message, "When people are shooting each other tomorrow, I will try and rest in the knowledge that we tried."
On Jan. 6, Trump resisted calls for hours to calm the mob after it had stormed into the Capitol. At 2:24 p.m., Trump tweeted that his vice president, Mike Pence, whom members of the mob had been calling to be hanged, "didn't have the courage to do what should have been done."
At 2:38 p.m., hours after the riots had started, he acknowledged them for the first time, tweeting, "Stay peaceful!" Later that evening, following a brutal skirmish between rioters and the police, Trump tweeted, "These are the things and events that happen when a sacred landslide election victory is so unceremoniously & viciously stripped away from great patriots . . . Remember this day forever!"
Twitter suspended Trump's account that evening for 12 hours, but he continued tweeting the next day, even as some Twitter employees began receiving threats. Five people died on the day of the insurrection or in the immediate aftermath, and 140 police officers were assaulted.
On Jan. 8, Trump tweeted that the "great American Patriots who voted for me . . . will not be disrespected or treated unfairly in any way, shape or form!!!" In his final tweet, at 10:44 a.m., Trump said he would not be attending President Biden's inauguration.
Twitter's decision to "permanently suspend" Trump that day followed internal deliberations and emergency meetings. In a statement that evening, Twitter said his tweets could be used to "incite violence" and showed that he planned to "support, empower, and shield those who believe he won the election."
But in philosophical tweets after Trump's ban, Twitter's then-chief executive, Jack Dorsey, expressed some reservations about having to take Trump's megaphone away. These actions "fragment the public conversation," he wrote, and "limit the potential for clarification, redemption, and learning."
Navaroli said she is still broadly hopeful about the internet's "amazing" ability to connect people, but she worries companies are still struggling to "find the right interventions and levers" around online expression that won't "lead us to this dystopian future I see ahead."
"I've just really wanted to do my job well," she said. "This is what I do."
The Jan. 6 committee's announcement Thursday follows months of questions about her identity. Her name and details of her work have been fiercely guarded by the committee, which has said its work could lead to criminal referrals of Trump over his role in the attack.
Navaroli left Twitter last year and is now researching the impact of hate-speech moderation through a fellowship at Stanford University. She said she hopes the testimony she gave the committee will help inspire more Silicon Valley insiders to speak publicly about their companies' failures to fight viral misinformation and extremist speech.
"My fear within the American context is that we have seen our last peaceful transition of power," Navaroli said. But "the same playbook," she added, is being used around the world, "teeing up the idea that if an election is not in someone's favor, it's been rigged. Without intervention we really are on this path to catastrophe."