Democrats and Republicans agree: Kids are addicted to social media and government can help
A polling memo crossed my desk this week from the conservative messaging guru Frank Luntz. It wasn’t about winning political battles or campaigns, but something far more consequential. It begins: “Of all the projects I’ve worked on over the past decade, none has more moms more concerned than social media and artificial intelligence.”
That should grip conservatives and progressives alike.
Luntz presented his research findings to a bipartisan nonprofit group, the Council for Responsible Social Media. (Full disclosure: I’m a volunteer board member.) He told us, “When moms tell me in focus groups that their children become petulant, sullen, even threatening when they try to take their cell phone away or limit access to social media, I listen — for those are signs of physical and emotional addiction. And when moms tell me they can no longer have productive conversations with their own children, I listen, because it may already be too late.”
Unlike most issues examined by pollsters, this issue is deeply personal. Luntz’s research found that about 64 percent of people ages 18-29 know someone who has been “damaged by social media.” Nearly half of all mothers know someone who’s been hurt. That’s hitting home, and it is hitting hard.
What fascinates me about the research is that it cuts fairly evenly across party lines: 47 percent of Republicans and 55 percent of Democrats believe that “children’s addictive relationship with social media makes them worse at interacting with people face-to-face.” There’s also consensus in both parties that social media “weakens children’s ability to think for themselves, robbing them of their social skills.”
Here’s another concern that cuts evenly across both parties: One-third of Americans say they can’t tell what’s true and not true on social media platforms. As Luntz notes: “At a time when ‘the truth’ is the single highest priority among the American population, this finding is among the most alarming.”
Artificial intelligence (AI) is especially worrying to Americans. Nearly half are concerned about its unintended consequences. Luntz writes: “I’ve heard some people in Washington wanting to go slow in creating AI safeguards, not wanting to act in haste. The public is of the opposite opinion: they want action now, before it is too late.”
And who should take that action? Another surprise: When asked if the government should address the potential impact of social media, only 23 percent of the public responded no. When it comes specifically to AI, the public prefers action now by a wide margin (62 percent to 38 percent), rather than “waiting and worrying” about stifling innovation.
Of course, “government regulation” has become a war cry in political attack ads, as sharp and lethal as “tax and spend.” But it’s blunted when applied to the issue of artificial intelligence. Luntz’s research tested effective framing to support responsible federal oversight and protections. The one that works best: “establish a government review process similar to the FDA’s for potentially addictive medicines.”
Polls are snapshots, a form of artistic impressionism where the subject is coaxed and goaded to arrive at certain judgments that are measured and weighed with margins of error. Focus groups, however, provide measures of depth into what really matters: opinions, thoughts, feelings. Luntz’s group tells stories of real people with deep anxieties who want their government to help protect their children from a serious mental health crisis. One respondent described social media as “a serious virus that is attacking people. It infects every aspect of our lives from how we get news to how we relate to our peers, our families and our communities.”
Yet another volunteered: “Until Congress puts laws in place for restrictions on using social media, it’s going to act like a vampire and suck the happiness out of everything.”
These people aren’t asking government to co-parent. They are asking government to facilitate responsible behavior by social media and AI platforms so that raising their kids isn’t a constant losing battle. Drug addictions require stealth and money. TikTok requires at least one finger and a tablet. As one focus group participant said, “I’m exhausted every day. It’s never ending.”
This week, Congress seemed to listen. The Senate Commerce Committee passed two bipartisan bills that would strengthen online privacy protections for children and require social media companies to design their platforms to better protect them: The Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act.
Over a quarter-century ago, when online platforms emerged and spread, policymakers grappled with how to predict the internet’s impact. At the time, it seemed to make sense to pass Section 230, which still protects internet service providers and users from liability for acts that others commit on their websites or online forums, even if the provider fails to take action after receiving notice of the harmful or offensive content. Ironically, the legislative genesis of Section 230 was “The Internet Freedom and Family Empowerment Act.”
Now we know: America’s families feel robbed of their freedom and powerless against the addictive algorithms and the warped reality of social media and AI. Parents see it in the glazed eyes and craned necks of their children; the slow, discernible softening of their social skills; the creeping isolation and insulation; the addictive desire not to be good but to be “liked.”
This time, the public, across the partisan divide, seems to be ahead of the policymakers. As Luntz writes: “The public wants action — and moms even more.”
And one thing we know is that moms know best. And they vote.
Steve Israel represented New York in the U.S. House of Representatives over eight terms and was chairman of the Democratic Congressional Campaign Committee from 2011 to 2015. He is now director of the Cornell Jeb E. Brooks School of Public Policy Institute of Politics and Global Affairs. Follow him on Twitter @RepSteveIsrael.
Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.