A former Meta employee told a Senate panel Tuesday that the company’s top executives dismissed warnings about teens on Instagram facing unwanted sexual advances and widespread bullying.
The allegations from Arturo Béjar, a former Facebook engineer who later returned to the company as a consultant, renewed a push for a bipartisan child online safety bill that would regulate Meta and other social media giants.
The hearing highlighted the rare bipartisan support in the Senate for the issue of kids’ safety online. Senators on both sides of the aisle doubled down on the need for Congress to take urgent action, specifically rallying around the Kids Online Safety Act (KOSA).
“No parent or child can trust Facebook, or Meta, after this whistleblower’s powerful account, laying bare their denial and deception,” said Sen. Richard Blumenthal (D-Conn.), chair of the subcommittee and a lead sponsor of KOSA.
“Congress must act. It must pass the Kids Online Safety Act,” Blumenthal added.
Sen. Josh Hawley (R-Mo.), the ranking member of the subcommittee, said the time for Congress to take action was “years ago,” and joined Blumenthal in calling for the bill to be brought to the floor this year.
Béjar, who revealed his allegations in a Wall Street Journal report last week, alleged that Meta executives knew about the harm to kids, and about potential ways to mitigate it, but chose not to act.
“We cannot trust them with our children and it’s time for Congress to act,” Béjar said.
After an earlier stint at Facebook, Béjar returned to Meta in 2019 on a two-year consulting contract, prompted by the inappropriate content his teen daughter and her friends encountered on Instagram.
At the core of Béjar’s allegations is a note he sent to the company’s CEO Mark Zuckerberg, then-chief operating officer Sheryl Sandberg, chief product officer Chris Cox, and head of Instagram Adam Mosseri in October 2021.
The email included results from surveys Béjar and his team conducted about the experiences teens faced on Instagram, according to a copy of the email released by the Senate subcommittee and first reported by the Journal.
Roughly 22 percent of Instagram users between 13 and 15 years old said they were the target of bullying, 39 percent said they experienced negative comparison, and about 24 percent said they received unwanted advances.
Béjar’s note acknowledged the gap between the data his team gathered and the data that Meta had been reporting about the prevalence of similar instances.
Béjar said he did not hear back from Zuckerberg.
Meta pushed back strongly on Béjar’s allegations and assessment. Company spokesperson Andy Stone said that there have been changes made to Meta-owned platforms as a result of surveys like the ones Béjar highlighted.
“Every day countless people inside and outside of Meta are working on how to help keep young people safe online. The issues raised here regarding user perception surveys highlight one part of this effort, and surveys like these have led us to create features like anonymous notifications of potentially hurtful content and comment warnings,” Stone said in a statement.
Unlike the surveys conducted by Béjar and his team, Meta’s public data on the prevalence of harmful content is based on how many times such posts and images are viewed by users. Stone said the two types of data are different and not in conflict with each other.
Béjar’s allegations are adding to the mounting scrutiny over how Meta and other social media giants impact youth mental health. Calls for action have been building in the last two years, since another former Facebook employee, Frances Haugen, revealed internal company research including reports about the impact of Instagram on teens.
Béjar said Instagram is a product “like ice cream, or a toy, or a car.”
“I ask you, how many kids need to get sick from a batch of ice cream or be hurt by a car before there’s all manner of investigation?” he said.
KOSA, which would add guardrails aimed at mitigating harm to minors online, advanced out of the Senate Commerce Committee in July. It advanced last year as well but did not come to the Senate floor.
The bill would require social media platforms to provide options for minors to protect their information, disable addictive product features and opt out of algorithmic recommendations.
It would also create a duty of care for social media platforms to prevent and mitigate harm to minors, such as through content promoting suicide, eating disorders or substance abuse.
Although it has bipartisan support, the proposal still faces opposition from LGBTQ advocacy groups over concerns it could block minors from accessing information and content online about the LGBTQ community and health care.
Dozens of LGBTQ organizations signed a letter last week outlining those concerns, specifically in states in which attorneys general “seek to aggressively censor positive, enriching content that they particularly deem to be offensive or harmful to minors.”
Blumenthal said lawmakers are making “modifications and clarifications” to the legislation while working with the LGBTQ community.
“We are very responsive to their concerns. This measure is not about content or censorship. It is about the product design that drives that toxic content at kids,” Blumenthal told reporters.
“We’re not trying to come between kids and what they want to see but simply enable them to disconnect from algorithms when it drives content that they don’t want.”