California is right: Addictive tech design is not free speech
The Northern District of California’s injunction against the Age-Appropriate Design Code (AADC), which holds that the law infringes on the First Amendment, demands our immediate concern and response.
Indeed, last week, California Attorney General Rob Bonta filed a notice of appeal against the injunction, calling the court’s decision “wrong,” and saying California “should be able to protect [its] children as they use the internet.”
By misinterpreting the definition of “free speech” and expanding it to encompass addictive technology design features, this injunction paves the way for dangers that could stall, or even reverse, progress in safeguarding the well-being of our youth.
Unlike previous battles over privacy rules, design regulation is an area that is still in its infancy and in which tech companies have avoided any meaningful oversight. The linchpin of what makes technology addictive lies in its design: the endless scrolling, the intermittent rewards, the dopamine hits. These designs are crafted explicitly to keep users engaged, thumbs scrolling and eyes glued to screens.
Yet such design elements should not be conflated with free speech rights; the court must distinguish them from content regulation, which the Age-Appropriate Design Code neither proposes nor requires. If we set a precedent now that equates these manipulative design features with First Amendment protections, we will effectively disarm our legislators from ever protecting our children from the known harms of these platforms.
We are faced with a daunting scenario: Do we prioritize the interests of tech corporations over the well-being and mental health of our children? To be clear, the First Amendment was crafted to protect freedom of speech, not manipulative and predatory designs.
The Age-Appropriate Design Code’s unanimous passage was rooted in overwhelming evidence about the detrimental effects of unregulated technology on young minds. Numerous studies have drawn direct links between increased screen time and rising rates of anxiety, depression and even suicidal tendencies among youth.
This concern isn’t based merely on external research; it’s backed by inside information. As highlighted by whistleblowers and investigative pieces like those from The Wall Street Journal, companies like Meta were fully aware that their algorithms were causing harm, especially to young girls.
A pressing concern is the opacity with which Big Tech runs its platforms. These conglomerates remain tight-lipped about the impacts of their products (impacts they know about from the experiments they run internally), which hampers not only parental decision-making but also scholarly research aimed at understanding and rectifying the problems. The reason is clear: they know that once their internal knowledge of harmful design comes to light, they will face liability and intense public disapproval.
Even as they keep this information private, outside research continues to expose the truth about their products. The U.S. Surgeon General’s Advisory on Youth Mental Health explicitly urges tech companies to pivot toward more responsible models that prioritize users’ health and well-being over profit, calling for more transparency, more robust health metrics, and digital environments that are both safe and inclusive.
Recommendations from the advisory are clear and pointed. From adopting precautionary approaches in product development to building tools that promote healthy online engagement, the emphasis is on creating spaces that actively uphold users’ mental and emotional health. The Age-Appropriate Design Code and other proposed bills across the country do just what the surgeon general urges: they require tech companies to evaluate whether their designs harm children and to mitigate that harm when it is identified internally.
When tech companies argue against policies such as the Age-Appropriate Design Code, they twist the narrative to serve their purposes. A growing chasm is evident between public perceptions of the First Amendment’s purpose and the interpretations emerging from courtrooms. The spirit of the Constitution is being forgotten, replaced with a corporatized version that prioritizes profits over people, and shareholders over children’s safety.
We find ourselves at a crossroads: on one hand, the boundless potential of digital technology; on the other, the possible derailment of an entire generation’s mental health. We must ask ourselves what we value more: children’s safety, or the right of corporations to act with impunity.
The injunction, with its finding of First Amendment violations, is a dangerous step away from safeguarding our children’s futures. The design of platforms should not hide behind the veil of “free speech.” When protecting our children is at stake, we must be prepared to challenge and change this emerging narrative.
Gaia Bernstein is a technology, privacy and policy professor of law, co-director of the Institute for Privacy Protection, and author of “Unwired: Gaining Control over Addictive Technology.”