A former Facebook data scientist has told Congress that the social network’s products harm children and fuel polarisation in the US, and its executives refuse to make changes because they elevate profits over safety.
Frances Haugen was giving evidence to the Senate Commerce Subcommittee on Consumer Protection, after accusing the company of being aware of apparent harm to some teenagers from Instagram and being dishonest in its public fight against hate and misinformation.
Ms Haugen has come forward with a wide-ranging condemnation of Facebook, with tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.
She has also filed complaints with federal authorities alleging that Facebook’s own research shows that it amplifies hate, misinformation and political unrest, but the company hides what it knows.
Ms Haugen says she is speaking out because of her belief that “Facebook’s products harm children, stoke division and weaken our democracy”.
“The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people,” she said in written evidence prepared for the hearing.
“Congressional action is needed. They won’t solve this crisis without your help.”
After recent reports based on documents she leaked to the Wall Street Journal raised a public outcry, Ms Haugen revealed her identity in a TV interview aired on Sunday night. She said: “Facebook, over and over again, has shown it chooses profit over safety.”
The ex-employee challenging the social network with 2.8 billion users worldwide and nearly a trillion dollars in market value is a 37-year-old data expert with a degree in computer engineering and a master’s degree in business from Harvard.
Before being recruited by Facebook in 2019, she worked for 15 years at tech companies including Google, Pinterest and Yelp.
The panel is examining Facebook’s use of information from its own researchers on Instagram that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative impacts.
For some teenagers, the peer pressure generated by the visually focused Instagram led to mental health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research leaked by Ms Haugen showed.
In one internal study, 13.5% of teenage girls said Instagram makes thoughts of suicide worse and 17% said it makes eating disorders worse.
“The company intentionally hides vital information from the public, from the US government and from governments around the world,” Ms Haugen said in her written evidence.
“The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems and its role in spreading divisive and extreme messages.”
As the public relations row over the Instagram research grew last week, Facebook put on hold its work on a children’s version of Instagram, which the company says is meant mainly for 10 to 12-year-olds.
At issue are the algorithms that govern what appears in users’ news feeds, and the extent to which they favour hateful content.
Ms Haugen, who focused on algorithm products in her work at Facebook, said a 2018 change to the content flow contributed to more divisiveness and ill will in a network ostensibly created to bring people closer together.
Despite the enmity that the new algorithms were feeding, Facebook found they helped keep people coming back — a pattern that helped the social media giant sell more of the digital ads that generate most of its revenue.
Ms Haugen also says Facebook prematurely turned off safeguards designed to thwart misinformation and incitement to violence after Joe Biden defeated Donald Trump last year, alleging that this contributed to the deadly January 6 assault on the US Capitol.
After the November election, Facebook dissolved the civic integrity unit where Ms Haugen had been working, and she says that was the moment she realised “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous”.
Facebook maintains her allegations are misleading and insists there is no evidence to support the premise that it is the primary cause of social polarisation.