Kids’ programming on TV is regulated — what about their digital devices?

The media landscape has undergone a seismic shift since the passage of the Children’s Television Act of 1990. Saturday morning cartoons on television are a relic of the past. In addition to concerns about commercials and educational value, parents now worry about exposure to inappropriate and disturbing content as kids watch videos on smart TVs, computers, tablets and phones. We’ve allowed major corporations prioritizing fast growth and profits to conduct an ongoing experiment on our kids.

The Federal Communications Commission (FCC) has proposed revising its rules on children's programming. For the first time in almost 30 years, policymakers, industry and children's advocates have an opportunity to put forward solutions to safeguard children's programming.

This week, Sen. Edward Markey (D-Mass.) proposed the Kids Internet Design and Safety Act, federal legislation strongly supported by the nonprofit that I head, Common Sense. The bill would extend the values and protections of the Children’s Television Act to today’s digital age. It would create rules covering all media platforms, placing limits on advertising and ensuring platforms address the root issue of algorithms that push unhealthy content to kids, as well as manipulative and addictive design features that keep kids glued to the screen.

The rules of the past need to be updated to work in today’s 24-hour streaming world. A recent Washington Post article underscores this problem: media platforms such as YouTube say they are protecting younger users but continue to rely on algorithms and auto-play that end up delivering inappropriate content to kids. Kids are regularly exposed to violence, self-harm, profanity, porn, hate speech or even a mass shooting live-streamed from a New Zealand mosque.

Unprepared for the rapid pace of technological change and its wide-ranging effects, political leaders and government regulators have struggled to keep up with technology's impact on kids’ digital well-being. But a slew of data breaches and privacy violations, amid public outrage about a growing toxic media environment, are forcing action.

California is leading the way in the United States with the passage of the landmark California Consumer Privacy Act, which goes into effect in 2020 and gives consumers more information and control over the vast data tech companies collect. The European Union passed an even more robust data privacy law, the General Data Protection Regulation (GDPR).

But far stronger action is needed on a much wider scale.

The Kids Internet Design and Safety Act is a step toward addressing the much larger and more pressing reality we all face today — the growing influence of tech on our kids and its unintended consequences. Ninety-eight percent of kids under 8 have access to a mobile device at home. Teens spend an average of nine hours a day with media. Kids and teens are exposed to everything from online advertising and inappropriate content to data collection without consent, misinformation and cyberbullying. The impact that many tech platforms have on the social, emotional and cognitive development of kids must be addressed.

Technology can be a powerful tool with many benefits, but parents and consumers have put their trust in the hands of companies that for years have not shown a willingness to take action that may restrict their ability to collect and profit from data. It’s time for policymakers to step in with major protections that can be incorporated into the design of technologies from day one. In addition to tighter privacy controls, platforms need far stronger policies and procedures to protect kids and teens from inappropriate content.

Our current policies are woefully out of date, and the industry has forged ahead nearly unfettered. The time is now for comprehensive, enforceable rules that reflect the current media landscape to inspire more quality and age-appropriate content, safeguard children’s programming and ensure the well-being of kids and generations to come. The time is now for the Kids Internet Design and Safety Act.

James P. Steyer is CEO of Common Sense, a national nonprofit advocacy organization for kids in the digital age. Follow him on Twitter @jimsteyer.