Facebook whistleblower Frances Haugen told a Senate panel Tuesday that Congress must act quickly to solve the “crisis” created by her former employer’s products and services.
Haugen, a former Facebook product manager for civic misinformation, told lawmakers that Facebook consistently puts its own profits over its users’ health and safety, largely because of algorithms that steer users toward high-engagement posts that in some cases are harmful. Though Haugen stopped short of accusing top executives of intentionally creating harmful products, she said that ultimately, CEO Mark Zuckerberg must be held responsible for the impact of his business.
Haugen also said that Facebook’s algorithm could steer young users from something relatively innocuous, like healthy recipes, to content promoting anorexia in a short period of time. She proposed that Facebook change its algorithms to stop prioritizing posts that generate the most engagement and instead show users a chronological feed. That, she said, would help Facebook deliver safer content.
Haugen, who revealed herself Sunday as the source behind leaked documents at the core of a revealing Wall Street Journal series about Facebook, testified before the Senate Commerce subcommittee on consumer protection. Haugen told “60 Minutes” in an interview aired this weekend that the problems she saw at Facebook were worse than anywhere else she’d worked, which includes Google, Yelp and Pinterest. She told the news program that she copied tens of thousands of pages of internal research that she took with her when she left Facebook in May.
“I saw that Facebook repeatedly encountered conflicts between its own profits and our safety,” Haugen said in her written testimony. “Facebook consistently resolved those conflicts in favor of its own profits. The result has been a system that amplifies division, extremism and polarization — and undermining societies around the world.”
In her statements, Haugen said she believes she did the right thing in coming forward, but is aware that Facebook could use its immense resources to “destroy” her.
“I came forward because I recognized a frightening truth: almost no one outside of Facebook knows what happens inside Facebook,” Haugen said in her written remarks. “The company’s leadership keeps vital information from the public, the U.S. government, its shareholders, and governments around the world.”
Haugen said a turning point that convinced her of the need to bring information outside of Facebook came when the company dissolved its civic integrity team after the 2020 U.S. election. Facebook said it would integrate those responsibilities into other parts of the company. But Haugen said that within six months of the reorganization, 75% of her “pod” of seven people, most of whom had come from civic integrity, moved to other parts of the company or left entirely.
“Six months after the reorganization, we had clearly lost faith that those changes were coming,” she said.
Facebook’s Impact on Young Users
The hearing was focused on Facebook’s impact on young users. Lawmakers expressed outrage at one of the Journal’s reports that said Facebook’s internal research had found Instagram created a horrific environment for some teen girls already experiencing negative feelings about their bodies. That prompted even stronger calls from lawmakers for Facebook to end plans to launch a version of Instagram for kids. Facebook has accused the Journal of cherry-picking data, emphasizing that research showed that a majority of users surveyed in several cases found positive effects of using its products, even when a small percentage felt it made their negative feelings worse.
Facebook has said that in a survey, eight out of 10 teen Instagram users in the U.S. said the platform made them feel better or had no effect on their feelings about themselves. But Haugen testified Tuesday that the remaining 20% is still significant on a platform that boasts billions of users worldwide.
“In the case of cigarettes, ‘only’ about 10% of people who smoke ever get lung cancer,” Haugen said. “So the idea that 20% of your users could be facing serious mental health issues and that’s not a problem is shocking.”
Though Facebook announced a temporary pause on its Instagram for kids plans, Haugen told senators she would be “sincerely surprised” if Facebook stops working on the product.
“Facebook understands that if they want to continue to grow, they have to find new users,” Haugen said, adding that this means instilling habits in kids.
Allegations of Lies and Understaffing
Along with her disclosures to the U.S. Senate and the Journal, Haugen also filed complaints with the SEC. The complaints claim Facebook lied to investors and advertisers by omitting or misrepresenting what it knew about how its platforms were being used, such as to spread misinformation, and the measures it was taking to combat that.
Haugen said Tuesday that, following the January 6 insurrection at the U.S. Capitol, Facebook gave its advertising staff talking points assuring advertisers that the company was doing everything it could to make the platform safer, including taking down all the hate speech it found. Haugen said this was not true.
Though she called on lawmakers to impose regulations on Facebook, she warned in her testimony that “Tweaks to outdated privacy protections or changes to Section 230 will not be sufficient,” referring to the legal shield that protects online platforms from liability for their users’ posts. She also said she believes a healthy social media platform is possible to achieve and that Facebook presents “false choices … between connecting with those you love online and your personal privacy.”
“The core of the issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood,” she said in her testimony, saying transparency is the right first step.
She told lawmakers that she consistently saw teams at Facebook go understaffed, which created “an implicit discouragement from having better detection systems.” She said that if Facebook had even a basic detector on the counter-espionage team on which she worked, it would be able to pick up on many more cases than it already handled.
Similarly, she added that Facebook could do “substantially more” to detect children on its platform and should be required to publish those processes for Congress. She said Facebook has the ability to identify more underage kids on the platform even if they lie about their ages.
Haugen also said that while she worked there, the counter-espionage team tracked Chinese actors on Facebook who were surveilling the Uyghur ethnic minority. She said the “consistent understaffing” of such teams is a national security concern and that she is speaking with other parts of Congress about it. Sen. Richard Blumenthal said the topic was ripe for another hearing.