Canada has an “amazing opportunity” to lead a coalition of smaller countries in demanding accountability from Facebook, says former employee-turned-whistleblower Frances Haugen.
“If you can get together 100 million, 200 million people — you know, countries — you will be able to force change from Facebook,” said Haugen, who left her role as a product manager at the social media giant last May and disclosed thousands of internal company documents to the media.
Haugen has accused Facebook of putting profits before the well-being of its users — from failing to protect children and their mental health, to fuelling misinformation and inciting political violence. She’s also called for stricter government oversight to address these problems.
“I have faith that Canada could be a leader in driving that change,” she told The Current’s Matt Galloway.
Haugen said she disagrees with the idea that larger powers, like the U.S. and European Union, should lead the charge.
She pointed to the U.K., which last year brought in sweeping regulations (and potentially heavy fines) around how websites and apps interact with children online, and how their data can be used.
As a result of the new rules, TikTok stopped sending younger users notifications later in the evening, and YouTube removed the autoplay function on videos for users aged 13-17. Facebook exempted users under 18 from some forms of targeted advertising, while the company’s Instagram platform made accounts for teen users private by default.
Haugen said Facebook made the changes globally because it’s difficult to customize services across multiple countries.
A coalition of countries led by Canada could call for similar changes, she said, through co-ordinated legislation that demands transparency and accountability from the company.
“The reality is, Facebook has to live in democracy’s house, right?” she said.
After Haugen went public in October, social media experts told CBC News that social platforms were unlikely to fix the problems on their own and needed government involvement.
The federal Liberal government tabled three bills around online protections during the last session of Parliament:
- Bill C-10, intended to update the Broadcasting Act to reflect modern media consumption, including streaming sites and user-generated content on social media platforms.
- Bill C-36, which proposed changes to curb online hate speech and give victims more resources.
- Bill C-11, which promised that Canadians would have greater control over their online data, with heavy penalties for companies that breach privacy.
All three bills died when Parliament was dissolved in August ahead of the federal election, though the Liberals have pledged to resurrect them.
The Current asked Canadian Heritage — which is co-leading federal efforts to regulate Internet giants — for comment, but did not receive a response by publication time.
‘Impasse’ over fixing problems
Haugen started working at Facebook in 2019, hoping to help solve problems around misinformation on the platform, but said she quickly noticed an “impasse.”

“The problem is the people whose job is to find these problems, and the people whose job is to authorize fixing these problems, are different people,” she said.
If fixing a problem doesn’t align with the incentives of those authorized to fix it — such as company growth — it doesn’t get fixed, she said.
Part of Haugen’s work was on the civic integrity team, which she described as being tasked with making Facebook a “positive force in politics.” But that team was dissolved a month after the 2020 U.S. presidential election, and its staff moved to a broader safeguarding team that did not have a specific mandate on politics.
That’s when Haugen decided to go public with her concerns.
“It showed a level of lack of commitment and, like, a blindness, that I was like, ‘This is just not acceptable.’ You can’t have a force that’s this dangerous that thinks itself as safe,” she said.
Haugen took pictures of internal documents before she left, which became the basis of a series of Wall Street Journal exposés.
Facebook didn’t set out to ‘incentivize rage’
Among her allegations was that the company was aware its Instagram platform could have a negative impact on the body image and mental health of its users, but failed to take action — something that was of particular concern to U.S. lawmakers when Haugen testified before a Senate committee last October.
At the time, Facebook responded that “the story focuses on a limited set of findings and casts them in a negative light,” but it stood by the research.
Haugen also alleged that an algorithm change in 2018 prioritized showing users content with more comments or shares, but that much of that engagement was negative, such as people arguing within comment threads.
Though the new algorithm brought more eyes to divisive content, she said, it also increased the amount of time users spent on the platform, which in turn increased revenue from digital ad sales.
Haugen told The Current she didn’t think the company set out to “incentivize rage,” but that it happened amid an overall drive to increase interaction on the site.
“They didn’t spend enough on safety systems or on people watching for these problems. That was the real issue,” she said.
In an email statement to The Current, a spokesperson for Meta, the parent company of Facebook, rejected the premise at the centre of Haugen’s claims: “Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own commercial interests lie.”
The statement further said Facebook has “over 40,000 people to do one job: keep people safe on our services.”
Haugen said advocates have long raised concerns about Facebook’s operations and impact, but transparency has been a key problem.
When concerns are raised, she said, the company will often downplay external evidence as “anecdotal,” without revealing its own investigations into problems or outlining what corrective action is taken.
She described a hypothetical scenario in which there are concerns about children being exposed to posts about self-harm. Through legislation, she said, Facebook could be compelled to track and report how many children are seeing that content and how often.
“Imagine a world where that number is reported. Would Facebook get better about self-harm content? Almost certainly. So we have to change that dynamic,” she said.
Zuckerberg hasn’t shown ‘willingness to change’
After Haugen’s testimony last October, CEO Mark Zuckerberg said the allegations mischaracterized Facebook’s work and priorities.
Later that month, Facebook Inc. rebranded to Meta, with Zuckerberg laying out a vision of a digital world where people can use avatars to play games together or attend virtual concerts.
To Haugen, this pivot to “video games and the metaverse” shows Zuckerberg is still primarily interested in growing the company — something he has been richly rewarded for over the years — rather than addressing its problems.
She expressed empathy that “Mark is in a really hard place because he has to learn to be a slightly different leader now,” but said that’s something she’s not sure he’s willing to do — and the company may require a new leader.
“Though I do have faith that if he wanted to take that path, he could do it,” she said.
Haugen also said that she would work for Facebook again, if the company asked her back.
“I believe this is the biggest problem in the world and we have to, have to, have to solve it.”
CBC Radio, written by Padraig Moran. Produced by Ben Jamieson.