Technology

YouTube rolls out new child content policy

YouTube on Monday rolled out a series of changes to its content policies aimed at protecting children on the platform, part of an effort to satisfy federal regulators who fined the company millions of dollars last year over alleged privacy violations.

The changes, first introduced last September, were fully rolled out on Monday. The Google-owned company will now restrict the collection of data from people who watch videos meant for children, whether or not the viewers are children themselves.

YouTube will also stop running targeted ads on content made for children.

The decision on what content falls under the new rules will be made primarily by content creators. As of Monday, creators must indicate during the upload process whether their videos are made for children.

“We also use machine learning to help us identify this content, and creators can update a designation made by our systems if they believe it is incorrect,” the company said in a blog post Monday, clarifying that YouTube can label a video as made for kids even if its creator does not.

“We will only override a creator designation if abuse or error is detected.”

YouTube will also begin running ads for YouTube Kids, its platform for children, which launched in 2015 with stripped-back features meant to make it safer for young users.

The changes all come in response to a $170 million settlement with the Federal Trade Commission (FTC) over alleged violations of the Children’s Online Privacy Protection Act (COPPA).

The FTC alleged in that case that YouTube violated COPPA, which requires companies to obtain parental consent before collecting data from users under the age of 13. According to the agency, YouTube collected the personal information of users who watched videos clearly directed toward children and then used that data for targeted advertising.