A generation ago, tobacco executives stood before Congress and swore under oath that cigarettes were not addictive. Their testimony was patently false, and a photo of their swearing-in became infamous.
Today, we must come to grips with a force as pernicious as the 20th-century tobacco industry: a social media ecosystem that is as dangerously toxic as it is addictive, with young people the most vulnerable. If social media does not change its ways, it will soon become today’s Big Tobacco.
As five tech CEOs prepare to testify before the Senate Judiciary Committee today, the parallels are undeniably striking.
U.S. Surgeon General Vivek Murthy has been sounding the alarm on social media’s impact on teen mental health, just as C. Everett Koop, the surgeon general in the 1980s, did on the dangers of smoking. Murthy said young people tell him three things about social media: “It makes them feel worse about themselves. It makes them feel worse about their friendships. And they can’t get off it.”
We all remember the founding purpose of social media. It helped us reconnect with old friends, share family updates and find like-minded people in hopes of creating a more curious, connected and compassionate world. At the same time, social media has made us more distracted, depressed and divided. In many cases, it focuses on our differences and perceived faults to maximize view time.
Companies leverage artificial intelligence to maximize view time. AI has rapidly and accurately determined that our brains are wired to linger longer on social media posts that trigger the darker aspects of human nature: emotions like fear, anger, lust and envy.
I witnessed this phenomenon playing out on Pinterest. I joined Pinterest a year and a half ago because I was drawn to the work it had done to purposefully avoid some of the more toxic elements of social media. I was intent on making emotional well-being even more central to the company’s purpose.
But soon after I started, we found that Pinterest’s recent pivot to short-form video and AI-driven feed optimization had begun surfacing much of the same triggering content as the rest of the industry.
In response, we set out to give the AI a new objective: to optimize for inspiring content and positivity and to mix in more intentional choices by giving users more control over what they see. When we retrained the AI, the content recommendations became much more positive and action-oriented: step-by-step guides, self-care ideas and inspirational quotes rose to the top.
Unfortunately, a few months later, a solid piece of investigative journalism found that grown men were following young girls on Pinterest and curating otherwise innocuous content into sex-themed boards. Even worse, our algorithms recommended more of this content to them.
While we moved aggressively to improve teen safety features, the episode rocked us and demonstrated how much work is needed across the industry to guide AI to the right outcomes, especially in social media.
Unlike the tobacco executives of decades ago, we need to hold ourselves accountable. The harm will be substantial if we don’t. It’s time for industry leaders to accept responsibility and build online platforms centered on positive well-being outcomes.
Big Tech doesn’t need to be the next Big Tobacco. We have the opportunity to disrupt current business models before we lose more young people to negativity and self-loathing.
But how can we do this?
We start by flipping the script: Put AI at the center of improving youth online safety and well-being. Unlike nicotine in tobacco, AI in social media can be put to positive uses.
First, social media platforms should use AI to exclude far more problematic content than they currently do. They must train their AI to be additive rather than addictive, and give users more control over what they see.
Second, platforms can leverage AI to bring more good to product design. For example, at Pinterest, we are using body type and skin tone range technology to shape our algorithms to increase representation across related feeds and search results. We also do not have image-altering effects or filters that reinforce unrealistic beauty standards.
Third, platforms can add special protections for teens. We have more restricted messaging for teens and added resources for those who feel anxious, stressed or sad. And we’ve taken a leading position by making accounts of teens under 16 private.
Fourth, we can work together by being transparent and sharing learnings on how to address these issues. We were the first to sign the Inspired Internet Pledge — a call to action for tech companies and advertisers to come together to make the internet a safer and healthier place for everyone, especially young people.
A successful social media business model that is focused on positivity is possible. I know this because we are building it here and seeing results. Pinterest revenue was up 11 percent in our last reported earnings, our users are engaging more than ever before and we saw a 20 percent increase in Gen Z users from 2022 to 2023.
We are not perfect and don’t have all the answers. Like the rest of the industry, we have a long way to go. But it is my intention to make Pinterest a safe place for everyone — especially young people.
As leaders, we must ask ourselves: Are we going to be more upset by negative press stories and being called to testify before Congress, or by the fact that young people are being harmed? If the answer is the latter, then we must take accountability and come together to raise the bar on safety.
Bill Ready is the CEO of Pinterest.