NYC suing social media companies for effects on youth mental health

FILE - Mayor Eric Adams speaks during a press conference at City Hall in New York, Tuesday, Dec. 12, 2023. (AP Photo/Peter K. Afriyie)

New York City Mayor Eric Adams (D) on Wednesday announced a lawsuit targeting five major social media platforms, alleging the companies behind them are responsible for “fueling the nationwide youth mental health crisis.”

The lawsuit, filed in California by the City of New York, seeks accountability from the companies behind five major platforms: Meta’s Facebook and Instagram, Snap’s Snapchat, ByteDance’s TikTok, and Google’s YouTube.

“Over the past decade, we have seen just how addictive and overwhelming the online world can be, exposing our children to a non-stop stream of harmful content and fueling our national youth mental health crisis,” Adams wrote in a press release.

“Today, we’re taking bold action on behalf of millions of New Yorkers to hold these companies accountable for their role in this crisis, and we’re building on our work to address this public health hazard. This lawsuit and action plan are part of a larger reckoning that will shape the lives of our young people, our city, and our society for years to come,” Adams added.

An official press release said the lawsuit aims “to force tech giants to change their behavior and to recover the costs of addressing this public health threat.” The press release noted that the city spends more than $100 million each year on youth mental health programs.

The 305-page complaint makes several sweeping claims against all of the defendants before detailing allegations against the individual companies.

The lawsuit alleges the defendants have “created” a youth mental health crisis and that the companies “could have avoided harming NYC Plaintiffs.”

The complaint alleges the defendants have specifically targeted school-aged children as “a core market,” noting that “millions of kids” use the social media platforms “compulsively,” including during school hours.

The plaintiffs also argue in the lawsuit that the social media platforms were “designed, developed, produced, operated, promoted, distributed, and marketed” to “attract, capture, and addict youth, with minimal parental oversight.”

The lawsuit comes amid a wave of legal cases seeking to hold social media platforms accountable for what plaintiffs allege is their role in rising mental health issues among children and teens.

In response to the lawsuit, representatives from the social media companies largely denied the allegations in the complaint and stressed the safety and privacy features they have developed in recent years.

“Providing young people with a safer, healthier experience online has always been core to our work,” Google spokesperson José Castañeda said in a statement reported by Axios. “In collaboration with youth, mental health and parenting experts, we’ve built services and policies to give young people age-appropriate experiences, and parents robust controls. The allegations in this complaint are simply not true.”

A Meta spokesperson said in a widely reported statement, “We want teens to have safe, age-appropriate experiences online, and we have over 30 tools and features to support them and their parents … We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.”

A Snapchat spokesperson said in a statement, “Snapchat was intentionally designed to be different from traditional social media, with a focus on helping Snapchatters communicate with their close friends. Snapchat opens directly to a camera — rather than a feed of content that encourages passive scrolling — and has no traditional public likes or comments. While we will always have more work to do, we feel good about the role Snapchat plays in helping close friends feel connected, happy and prepared as they face the many challenges of adolescence.”

A TikTok spokesperson said, “TikTok has industry-leading safeguards to support teens’ well-being, including age-restricted features, parental controls, an automatic 60-minute time limit for users under 18, and more. We regularly partner with experts to understand emerging best practices, and will continue to work to keep our community safe by tackling industry-wide challenges.”