After the Whistle Blows

Anika Collier Navaroli. Gilet, shirt, and pants, BRUNELLO CUCINELLI. Top and bottom rings, DINH VAN. Bangle, NOUVEL HERITAGE. Eyewear, necklace, and middle ring, her own. Photographed by Deirdre Lewis.


One day when Anika Collier Navaroli was in middle school, her mother took her on a trip to the supermarket in their Florida hometown. On their way, a man driving a pickup truck with a Confederate flag on the back veered onto the sidewalk and yelled racial slurs at them. Navaroli and her mother quickly ran inside a store, where they called the police and waited for help to arrive—while the truck sat parked outside. “This person had parked their car, gone on shopping,” Navaroli recalls, still mystified. “The cops basically told my mother and I that there was nothing they could do because there was no crime that had been committed and that this person had the First Amendment right to say and do what they had done,” she explains. But the experience stuck with her. “It really started something in my brain, trying to understand the interpretation of the First Amendment and free speech that could allow or condone or make room for these sorts of experiences that were incredibly violent and incredibly dangerous.”

It’s a question Navaroli would dedicate her career to exploring. After graduating from the University of Florida, she earned her law degree at the University of North Carolina and studied journalism at Columbia University, where she wrote her master’s thesis, “The Revolution Will Be Tweeted,” on the role of social media in movements like Occupy Wall Street. Her interest in free speech and technology eventually led her in 2019 to Twitter, where she was hired to help improve the company’s content moderation and conduct policies. But in the months leading up to the 2020 U.S. election, that work became increasingly complex to navigate.

While testifying last year for the House Select Committee investigating the January 6, 2021, attack on the U.S. Capitol, Navaroli described a period at Twitter in late 2020 and early 2021 when hate speech and incitements to violence on the platform were on a precipitous rise. Navaroli told the congressional committee that she and her team voiced concerns about the relationship between then-president Donald Trump’s incendiary rhetoric during his campaign against former vice president Joe Biden—which later included claims that the election was “stolen” from him after Biden was deemed the winner—and increasing calls for uprising on the platform. Navaroli pointed to the September 2020 presidential debate, when Trump appeared to be speaking directly to members of far-right extremist organizations like the Proud Boys, telling them to “stand back and stand by,” as an inflection point of sorts. “The floodgates kind of opened at that moment,” she said. On the platform, she testified, there was “a very specific shift in tenor from ‘Well, maybe we should do a civil war’ to ‘We’re definitely going to do [a] civil war, and we’re looking for a time, place, and manner.’ ”

Navaroli said she and her team pushed for stronger policies for content moderation—among them, arguing for the removal of individual tweets and a more nuanced policy surrounding what was referred to as “coded incitement to violence” or “dog whistles.” But executives at Twitter, she said, were hesitant to act—in part, she testified, because they “relished in the knowledge that they were also the favorite and most-used service of the former president and enjoyed having that sort of power within the social-media ecosystem.”

“If we made no intervention into what I saw occurring, people were going to die,” Navaroli told the committee. “And on January 5, I realized no intervention was coming.”

Trump remained on Twitter, tweeting through the January 6 attack, during which four people lost their lives and close to 140 police officers were injured. That evening, Twitter locked Trump’s account, then unlocked it the next day before “permanently” suspending it on January 8. (The suspension was lifted in November, following Elon Musk’s $44 billion purchase of Twitter last fall.) Navaroli, who by January 2021 was the most senior and tenured member of the safety policy team, left not long after.

Over the past two decades, big tech has come to influence virtually every aspect of our lives, from how we communicate to the ways we understand what’s happening in the world. The start-up culture that birthed behemoths like Twitter, Facebook, and Google is one built on promise: a belief in the power of ingenuity and creativity to democratize society and propel humanity toward some mythic techno utopia. The cult of the eccentric futurist disruptor-founder has turned tech entrepreneurs into celebrities and the move-fast-and-break-things ethos they espouse into a kind of seductive mantra.

The work done in the name of “disruption,” though, can have hidden consequences. In a 2016 TED talk, Uber cofounder (and now former CEO) Travis Kalanick lamented how, in 1914, a group of trolley owners in Los Angeles pushed for regulations to block a local car salesman from undercutting their business by offering cheap rides to people waiting in line. The trolley owners, Kalanick argued, were stifling innovation. “Imagine, without the regulations that happened, if that thing could just keep going,” he mused. “But technology has given us another opportunity.” A trove of documents leaked to The Guardian last year revealed that between 2013 and 2017, Uber, amid its own global expansion, flouted laws, lobbied politicians, and took advantage of gig-economy workers. (“There has been no shortage of reporting on Uber’s mistakes prior to 2017,” began a statement issued in response to the leak by Uber, now under the leadership of Dara Khosrowshahi, who had succeeded Kalanick as CEO that year. “We have not and will not make excuses for past behavior that is clearly not in line with our present values.”)

Increased regulation around digital privacy, data, and the way tech companies operate in the marketplace, many advocates say, could help rein in an industry where ambition can come at the expense of safety. But for an industry that prides itself on challenging the status quo, there doesn’t seem to be much room for dissent. Discussions around issues like responsibility and equity—and the people who try to initiate or engage in them—are often viewed as impediments to the pursuit of wealth and growth.

Ellen Pao. Sweater, NILI LOTAN. Pants, PETER DO. Necklace and bracelet, DINH VAN. Photographed by Deirdre Lewis.

Ellen Pao, who sued her employer, Silicon Valley venture-capital firm Kleiner Perkins, in 2012 for gender discrimination, observed in her 2017 memoir, Reset, that tech’s exclusionism is rooted in the homogeneity of the decision makers. “For decades now, the venture industry has been dominated by white men who invest in white men, who are successful and reinforce this idea that it’s this very specific set of people who are great entrepreneurs and who will make money for your companies,” she wrote.

Although Pao lost her suit, the publicity surrounding it captured the attention of women across America who recognized their own experiences of being sidelined, ignored, or forced out of male-dominated work environments. The case seemed to ignite a wave of promotions at venture-capital firms and inspire more women in white-collar industries to call out the discrimination they’ve confronted, a phenomenon dubbed “the Pao effect.”

The momentum, however, has been uneven. In 2020, women led only 4.7 percent of Silicon Valley’s top 150 companies by revenue, according to a 2021 report by law firm Fenwick & West. A study by PitchBook last year showed that the share of venture-capital money going to women-founded start-ups has been declining, falling to just 2 percent in 2021. Yet women make up a disproportionate number of the whistleblowers who have come forward about the internal machinations and inequities in Silicon Valley and at big tech companies.

The number is indicative of their status within the industry; women and people of color are less likely to be promoted to management positions and are often denied the power and resources they need to effect real change. And too often, when they are promoted to leadership positions, they are brought in to manage crises, which places higher expectations on them than on their white male counterparts—a phenomenon researchers at the University of Exeter in the U.K. called the “glass cliff.” Women and Latinx employees were also among those disproportionately impacted by the recent rounds of layoffs at companies like Amazon, Meta, Microsoft, and Twitter, according to employment-information startup Revelio Labs, which analyzed data from industry layoff tracker Layoffs.fyi and Parachute by Rocket.

In her initial testimony to Congress, Navaroli was identified only as “J. Smith.” But after a distorted recording was played during the January 6 hearings last July, the Associated Press misgendered her, referring to the Twitter whistleblower as “he.” Being misgendered offered Navaroli a modicum of security by protecting her anonymity, but it also did not allow someone like her—a queer Black woman—to exist in the spaces she occupied. “It became very clear to me that there were deep assumptions about who a powerful tech whistleblower could be and that these assumptions did not include me and who I am and the identities that I hold—that only a straight white man could have been in the position I was in or have done the work that I did,” says Navaroli, who was called once again on February 8 to testify before Congress, this time alongside a trio of other former Twitter executives for the new Republican-led House Committee on Oversight and Accountability.

“This idea of being a disruptor is really glamorized or iconicized within Silicon Valley,” says Navaroli, now a fellow at Stanford University’s Center on Philanthropy and Civil Society, where she is studying the experiences of Black content moderators in tech. “But for a lot of folks, being disruptive within the industry or at your job is very dangerous. It can lead to being fired, being pushed out, or your contribution being erased.”

Speaking out against a culture or an institution can come at great personal cost. “There is a huge mental toll,” says Pao, who continues to invest in the tech sector and now runs a nonprofit called Project Include, which is focused on increasing diversity. “There is a huge change in how you view the world. There’s a huge change in what your opportunities are and how people see you,” she explains. “You lose your privacy. Everybody has a prejudgment about you. … They’ve got these strong ideas that are hard to change—sometimes good, sometimes bad. But it’s just, like, you lose the chance to be a normal human being.”

In 2020, Ifeoma Ozoma came forward about the discrimination she said she encountered while working at Pinterest—which, The New York Times reported, included allegations of racism and sexism. After leaving Pinterest that year, she launched the Tech Worker Handbook, an online resource guide aimed at helping people in the industry make informed decisions about whether to bring forth allegations of misconduct. “I personally don’t believe that anyone should martyr themselves for any cause,” Ozoma says. “I just want folks to have their eyes wide open about what it’ll mean for them and their families when they speak up.”

Ozoma now serves as director of tech accountability at the UCLA Center on Race and Digital Justice and is the founder of Earthseed, a consultancy that advises individuals and companies on public policy and misinformation. She also helped draft and, through Earthseed, cosponsored California’s “Silenced No More” bill, which was signed into law by Governor Gavin Newsom in 2021 and enables workers to speak out about discrimination and harassment even if they’ve signed nondisclosure agreements. (A similar bill passed last year in Washington, where Amazon and Microsoft are headquartered.) Ozoma has led a push for tech firms to limit the scope of, or do away with, nondisclosure agreements. “Tech is ubiquitous; it’s in all of our lives,” she says. “But that’s also where the power lies for the companies and why abuses that would be seen as egregious anywhere sort of fly under the radar.”

Frances Haugen. Turtleneck, AKRIS. Pants, MARINA RINALDI. Elsa Peretti necklace and bracelet, TIFFANY & CO. Photographed by Deirdre Lewis.

In recent years, tech firms have made more concerted efforts to reckon with their problems and to operate with greater transparency. But the people brought in to help address those concerns still often face an uphill battle.

Frances Haugen’s reason for wanting to work at Facebook was personal: In 2016, she saw a close friend become radicalized by conspiracy theories on the internet. It was painful for Haugen to watch parts of this person “disappear” because he was “getting fed such a stream of misinformation” from sites like 4chan and Reddit. She began to pay attention to how Facebook’s news feed served content to users, based not on the quality or accuracy of a story but whether or not a user engaged with it. It was an aspect of the platform’s algorithm that troll farms sought to exploit during the 2016 U.S. election cycle, feeding Facebook stories designed to stoke fear and outrage and spread misinformation.

Haugen, an algorithmic product specialist and a veteran of Google, Pinterest, and Yelp, saw room for improvement. “I understood the power of design choices, and I understood how few people in the industry work on those systems,” she says. In 2019, she accepted an offer to work as lead product manager on Facebook’s civic misinformation team and was up-front about her reasons for joining. “I joined Facebook because if I could keep one other person from feeling the pain that I felt when I lost my friend, it would have been worth it.”

But Haugen’s tenure at Facebook—now known as Meta—didn’t go as she had hoped. Haugen says she thought she was going to work on misinformation around the 2020 U.S. election, but she was assigned to analyze misinformation in territories all over the world. In countries like Ethiopia, Myanmar, and Sri Lanka, which were mired in civil strife and ethnic violence, abuse of the platform was rampant, and Facebook, she felt, wasn’t dedicating the resources to combat it effectively.

The tipping point for Haugen came in the spring of 2021, after her team was dispersed to other initiatives across the company. She resigned and later leaked thousands of internal documents to The Wall Street Journal. The Journal’s reporting on the materials revealed Facebook’s awareness of some of the detrimental effects that activity on its platforms, which include Instagram and WhatsApp, appeared to be having on users—from Instagram’s impact on the mental health of teenage girls to how Facebook was being used to spread hate and incite violence against vulnerable populations.

In a statement posted on Facebook’s corporate site, the company’s vice president of global affairs, Nick Clegg, said that the Journal’s analysis contained “deliberate mischaracterizations” of Facebook’s work and motives while acknowledging that the report touched on “some of the most difficult issues we grapple with as a company—from content moderation and vaccine misinformation to algorithmic distribution and the well-being of teens.” Clegg added: “These are serious and complex issues, and it is absolutely legitimate for us to be held to account for how we deal with them.”

Haugen likens what she saw among colleagues at Facebook to a kind of “moral injury,” a term coined by psychiatrist Jonathan Shay to characterize the “undoing of character” experienced by Vietnam War veterans. “I would see people with a sense of learned helplessness, like nothing could be done,” she says. “When you can watch that kind of phenomenon in the best place in the company for you to try to advocate for change … I didn’t really see a point.” Instead, she contacted Whistleblower Aid, a nonprofit that helps public- and private-sector employees disclose their concerns legally and safely. “I feel like I didn’t get to make much of a decision,” says Haugen, whose book about her experiences in tech and at Facebook, The Power of One, is set to be released in June. “The thing that I chose was to have a future, because having to live with that regret was going to wipe out my future.”

Timnit Gebru. Turtleneck, LORO PIANA. Skirt, STUDIO 189. Hoops and pendant, KHIRY. Photographed by Deirdre Lewis.

While whistleblowers have been lauded for their bravery, there remains a brutal irony: As their former employers plow ahead, those who speak out are often left to pick up the pieces. “It constantly puts people on the defensive, to clean up the harms—rather than giving people from different backgrounds the ability to innovate and forge our imagination of the future,” says scientist and AI ethicist Timnit Gebru. “It actually hinders innovation.”

In 2018, Google hired Gebru to study the social implications of artificial intelligence and offer solutions to make Google’s algorithms more fair and equitable. Researchers like Gebru have long cautioned about the ways AI can perpetuate or exacerbate existing bias. As companies and governments increasingly rely on AI to automate decision-making, these issues have real-world ramifications that can result in the surveillance and policing of communities of color and affect decisions over housing, insurance, health care, and more.

Prior to joining Google, Gebru cowrote a widely cited paper that identified bias in the way facial-recognition software recognized darker-skinned women. But in 2020, after a disagreement over the publication of a new paper she had coauthored, which examined the environmental impact of large-scale AI computing models and how racist and sexist language culled from the internet can become embedded in those systems, Gebru says she was fired. In a message to an internal employee group, she was also critical of Google’s response to efforts by her and her colleagues to advocate for women, people of color, and underrepresented communities in both their work and their workplace. Within days, more than 1,200 employees signed a letter in protest of her departure, which Google characterized as a resignation.

In a companywide email, Google CEO Sundar Pichai later pledged to “assess the circumstances that led up to Dr. Gebru’s departure, examining where we could have improved and led a more respectful process.” Pichai wrote: “[W]e need to accept responsibility for the fact that a prominent Black, female leader with immense talent left Google unhappily. This loss has had a ripple effect through some of our least represented communities, who saw themselves and some of their experiences reflected in Dr. Gebru’s.” But Google, which began reporting its own diversity data in 2014, has continued to lag in the hiring and retaining of women of color. According to the company’s 2022 report, women who identify as Black and Latinx accounted for just 2.3 percent and 2.4 percent, respectively, of the company’s U.S. workforce.

The episode brought to light some important things for Gebru. “It really clarified to me that I just cannot exist in these spaces,” she says. In 2021, Gebru founded the Distributed Artificial Intelligence Research Institute (DAIR), a diverse group of labor organizers, activists, and researchers exploring how the potential of AI can be harnessed more responsibly. “There isn’t one single path that is a predestined path in terms of technology. Multiple different paths are possible, and it’s always shaped by who has power and who has money and who is funneling resources,” Gebru says. “I just stopped wasting my time trying to convince these people to give us a seat at whatever table they have,” she adds. “I want to create my own table, on my own terms.”


For Navaroli and Pao, hair: Elise Bigley; makeup: Hether Beckrest; manicures: Rochelle Dingman; production: Annee Elliot Productions. For Gebru and Haugen, hair: Elizabeth Morache; makeup: Amy Strozzi; manicures: Nina Park; production: Jess Oldham.


THIS ARTICLE ORIGINALLY APPEARED IN THE OCTOBER 2022 ISSUE OF HARPER'S BAZAAR.
