The Central App

The Facebook whistleblower's story

RNZ

11 July 2023, 5:46 PM

Former Facebook employee turned whistleblower, Frances Haugen. Photo: ROY ROCHLIN/AFP

Frances Haugen felt she had no choice but to blow the whistle over Facebook's lack of action to improve the safety of its platform.


Haugen worked as a product manager on the civic integrity team at Facebook for two years, before smuggling about 22,000 internal documents from the tech giant's Silicon Valley headquarters.


These documents revealed the social network knew from its own research that it was damaging teenage mental health, amplifying hate and ethnic violence, and undermining democracy.


In 2021, Haugen shared the documents with United States authorities and the Wall Street Journal.



She tells her personal story and continues her call for big tech to improve its transparency and accountability in her new book, The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook.


Haugen's decision to gather sensitive information and release it publicly was driven by a desire to be able to sleep at night, she told Afternoons.


"I felt in many ways like Facebook had taken my future from me, that I didn't have a choice whether to act or not, because if I didn't act, I would still pay the costs of it in the future," she said.



The definitive moment that spurred her to speak out was when the civic integrity group she worked for was dissolved by Facebook after the 2020 US national election, she said.


"It became obvious to me Facebook was not willing to commit the resources in an ongoing way to be the change that was needed."


The 2019 terror attacks on Christchurch mosques were a wake-up call for Facebook, Haugen said. The gunman livestreamed his attacks on Facebook, but the platform didn't remove the videos until New Zealand police asked it to.


"It made them aware they had created a very powerful tool and a very seductive tool that they had minimal control over."



Facebook had plenty of options available to restrict misinformation, but often chose not to implement them, Haugen said.


"If you cut re-share chains when they get beyond a friend of a friend, that has the same impact on misinformation as the entire third party fact checking programme, only it works in every language and it's not about picking and choosing good and bad ideas."


Facebook had not encouraged as much research on its platform as was needed, she said.


"How many hours a day is it safe to use?


"The reason why we don't have definitive answers on what amount is safe is because Facebook doesn't want the public to have those answers."


Haugen said it was particularly important that children aged under 13 were not using Facebook, because they were at the highest risk of negative impacts.


"Kids should charge their phones in their parents' bedrooms at night, because one of the most dangerous impacts of these platforms is sleep quality.


"A kid staying up till two, they're going to do worse in school and they're going to be at higher risk for a bunch of harms... everything from depression and anxiety to things like bipolar and schizophrenia... substance abuse... to accidents."


Haugen said she knew Facebook had weaknesses, but she became more alarmed about them the more she dug into the company's files.


"I knew something was wrong when I started. By the time I got a good chunk through documenting, I thought, 'This is so much worse than what I'd thought.'


"That period of time was super scary for me because I knew there was only going to be one chance at being able to let the public know.


"If Facebook discovered what I was doing before I was done, no-one was ever going to get to try again.


"That sense of responsibility was really, really hard, because imagine if you think about the consequences that could be as serious as ethnic violence - by that point, there had been two major ethnic violence instances that had been attributed to Facebook."


Haugen said Facebook's impact was worse in non-English speaking countries.


"Facebook is much, much cleaner and much, much safer for English than it is in any other language in the world.


"They spend much, much, much more on safety for English, even though only about eight or nine percent of users speak English.


"Facebook is basically the internet for a couple of billion people in the world. They went into the most fragile places in the world and said, 'If you use our products, your data is free'.


"People don't have independent shops - they use Marketplace. They don't have websites - they use pages. There are not necessarily newspapers - there are just mega-groups.


"All of these vulnerabilities of Facebook are amplified because there aren't any checks or balances," Haugen said.


Although she grew up learning to keep her head down, she felt "liberated" by the experience of feeling forced to speak out in order to live with her conscience.


"I really want to help people reclaim the idea that we do have power and the way the world changes is that we change it," she said.