U.S. Regulators Propose New Online Privacy Safeguards for Children

The Federal Trade Commission (FTC) has proposed significant changes to the federal rule protecting children’s online privacy. The proposed changes aim to strengthen the rule implementing the Children’s Online Privacy Protection Act of 1998, which restricts online tracking of children by social media apps, video game platforms, toy retailers, and digital advertising networks.

The proposed updates include turning off targeted advertising by default for children under 13, prohibiting the use of personal data to induce children to stay on platforms longer, and strengthening security requirements for online services that collect children’s data. These changes would shift more of the responsibility for children’s online safety from parents to digital services while curbing the use and monetization of children’s data. The public has 60 days to comment on the proposals before the commission votes on them.

Overview

Introduction to the proposed online privacy safeguards for children

The Federal Trade Commission (FTC) has recently proposed a set of comprehensive changes to strengthen online privacy safeguards for children. These proposed changes are aimed at bolstering the rules of the Children’s Online Privacy Protection Act (COPPA) and shifting the burden of ensuring online safety from parents to digital service providers. The proposed changes outline various measures to restrict the collection and use of children’s personal data by online platforms.

Importance of bolstering children’s privacy online

Protecting children’s privacy online has become increasingly important as children spend more time on connected devices and online services, raising concern about how their personal data is collected and used. Stronger safeguards help keep children safe, promote their well-being, and shield them from threats such as manipulative targeted advertising, data breaches, and inappropriate content.

Proposed Changes

Disabling targeted advertising by default for children under 13

One of the proposed changes is to disable targeted advertising by default for children under the age of 13. This means that online services specifically targeted at children or known to have children on their platforms will no longer be able to use personal details to tailor ads to individual children. By disabling targeted advertising, children can have a safer online experience without being subjected to manipulative advertising practices.

Prohibition of using personal details to incentivize children to stay on platforms

The proposed changes would also prohibit online services from using personal details, such as a child’s cellphone number, to induce children to stay on their platforms longer. This restriction aims to prevent online services from exploiting children’s personal data to encourage prolonged engagement.

Limiting push notifications based on personal data

Another proposed change is to limit the use of personal data for push notifications directed at children. Online services will no longer be allowed to send push notifications to children based on their personal data, such as their online behavior, without verifiable parental consent. This measure aims to protect children from excessive exposure to intrusive notifications and encourage responsible data usage by online platforms.

Strengthening security requirements for data collected by online services

The proposed changes also include strengthening security requirements for online services that collect and store children’s data. These requirements aim to ensure that online platforms have robust security measures in place to protect children’s personal information from unauthorized access, data breaches, and other security threats. By enhancing security requirements, the proposed changes aim to minimize the risk of data breaches and safeguard children’s privacy.

Limiting the retention of children’s data by online services

To further protect children’s privacy, the proposed changes also limit how long online services may retain children’s data. Online platforms would be required to specify how long they retain children’s data and to delete or anonymize it once it is no longer necessary. This measure aims to reduce the risks associated with long-term storage of children’s personal information and promote responsible data-handling practices.

Restricting the collection of student data by educational-tech providers

The proposed changes also address the collection of student data by educational technology providers. Online services used in educational settings will be restricted from collecting students’ personal details for commercial purposes. Instead, such data collection will require explicit consent from schools and will only be permitted for educational purposes. This measure aims to protect students’ privacy and ensure that their data is not exploited for commercial gain.

Statement by Lina M. Khan

The need to safeguard children’s data

Lina M. Khan, the chair of the Federal Trade Commission, emphasized the need to safeguard children’s data in the online environment. She highlighted the importance of protecting children’s personal information from being excessively tracked, hoarded, and monetized by companies. By strengthening children’s online privacy safeguards, the proposed changes aim to place affirmative obligations on service providers to better safeguard children’s data and prevent the outsourcing of responsibilities to parents.

Affirmative obligations on service providers

The proposed changes outline affirmative obligations on service providers to ensure the protection of children’s personal data. These obligations include disabling targeted advertising by default for children under 13, prohibiting the use of personal details to incentivize children to stay on platforms, and limiting push notifications based on personal data. Additionally, the proposed changes emphasize the need for stronger security requirements and data retention limitations for online services. These affirmative obligations aim to enhance children’s privacy protection and promote responsible data practices by service providers.

Preventing the outsourcing of responsibilities to parents

Lina M. Khan also emphasized the importance of preventing the outsourcing of responsibilities to parents in ensuring children’s online privacy. By placing affirmative obligations on service providers, the proposed changes aim to shift the burden of online safety from parents to digital service providers. This measure recognizes the challenges parents face in monitoring and protecting their children’s online activities and ensures that service providers take necessary steps to safeguard children’s data without relying solely on parental oversight.

Background on COPPA

Overview of the Children’s Online Privacy Protection Act

The Children’s Online Privacy Protection Act (COPPA) is a federal law in the United States that safeguards children’s online privacy. Enacted in 1998, COPPA regulates the collection, use, and disclosure of personal information from children under the age of 13 by online services. The law requires online services that are directed at children or have knowledge of children using their platforms to obtain parental consent before collecting personal details.

Requirements for online services targeted at children

Under COPPA, online services targeted at children or known to have children on their platforms must adhere to specific requirements. These requirements include obtaining verifiable parental consent, providing clear and comprehensive privacy policies, and implementing reasonable security measures to protect children’s personal information. Online services must also provide parents with the option to review and delete their child’s personal data and must not condition participation in an activity on the provision of more personal information than necessary.

Challenges in enforcing the law by regulators

While COPPA provides a legal framework for protecting children’s privacy online, enforcing the law poses challenges for regulators. The rapidly evolving digital landscape and the emergence of new technologies make it difficult to ensure full compliance with COPPA. Regulators face the task of monitoring a vast array of online services and enforcing compliance across various platforms. Additionally, the complexity of data collection practices and the global nature of online services present challenges in enforcing the law effectively.

Fines paid by tech companies for violating COPPA

Despite the challenges in enforcement, regulators have taken action against tech companies for violating COPPA. Several major tech companies, including Amazon, Microsoft, Google, Epic Games, and Musical.ly (now known as TikTok), have paid multimillion-dollar fines to settle charges of COPPA violations. These violations include collecting personal information from children without parental consent, showing targeted ads to children without proper authorization, and retaining children’s data beyond the necessary retention period. These fines serve as a deterrent and highlight the importance of compliance with COPPA’s requirements.

Legal Actions and Concerns

Federal lawsuit against Meta for violating children’s privacy law

In October, a coalition of 33 state attorneys general filed a joint federal lawsuit against Meta, the parent company of Facebook and Instagram, for violating children’s privacy law. The lawsuit alleges that Meta’s age-checking system allowed millions of underage users to create accounts without parental consent. This legal action underscores the significance of ensuring compliance with children’s privacy laws and holding online service providers accountable for their practices.

Criticism of Meta’s age-checking system

Meta’s age-checking system has drawn significant criticism for its alleged failure to effectively verify the age of users. Critics argue that allowing minors to create accounts without proper age verification exposes them to potential risks and violates children’s online privacy protections. The concerns raised highlight the importance of robust age verification mechanisms and the need for stricter enforcement of age restrictions on online platforms.

Public concern over mental health and safety risks of online services

There is growing public concern over the mental health and safety risks associated with popular online services. Parents, pediatricians, and children’s groups have raised alarms about social media recommendation systems that promote harmful content related to self-harm, eating disorders, and plastic surgery to young girls. These concerns underscore the potential harms of online platforms to children’s well-being and the need for stronger safeguards to protect their mental health and safety.

Inappropriate content shown to young girls on social media

The exposure of young girls to inappropriate content on social media platforms is a significant concern. Targeted ads and content recommendations have been criticized for promoting harmful and age-inappropriate material to young girls. This alarming trend raises questions about the responsibility of online service providers in ensuring age-appropriate content and protecting children from exposure to potentially harmful or inappropriate material.

Distraction of students by social media in schools

School officials have also expressed concern about the distraction caused by social media platforms in educational settings. The addictive design of these platforms and their constant notifications can disrupt students’ focus and impede learning. The potential impact of social media on academic performance underscores the need for responsible use of online platforms and measures to minimize distractions in school environments.

Laws passed by states to restrict minors’ access to social media

States have taken action to address the risks associated with minors’ access to social media platforms. More than a dozen laws have been passed to restrict minors’ access to social media networks or pornography sites. However, these laws have faced legal challenges from industry trade groups, which argue that they infringe on free speech rights. The tension between protecting minors and preserving freedom of speech highlights the complex nature of regulating online services.

Review Process

Initiation of the review by the FTC in 2019

The FTC initiated the review process of the children’s privacy rule in 2019 to assess the effectiveness of the existing safeguards and identify necessary updates. The review aimed to gather insights from various stakeholders and evaluate the evolving landscape of technology and online services. The review process provided an opportunity for feedback and input from tech and advertising industry trade groups, video content developers, consumer advocacy groups, and members of Congress.

Input received from various stakeholders

Throughout the review process, the FTC received significant input from various stakeholders. Tech and advertising industry trade groups, representing companies like Amazon, Apple, Google, and Meta, provided recommendations and insights. Video content developers, consumer advocacy groups, and members of Congress also contributed their perspectives on the need for stronger children’s online privacy safeguards. The input received helped shape the proposed changes and ensure a comprehensive approach to protecting children’s privacy.

Length and details of the proposed changes

The proposed changes, resulting from the extensive review process, comprise a detailed set of provisions and requirements. The proposal spans over 150 pages and covers various aspects of children’s online privacy, including targeted advertising, personal data use, security requirements, data retention limitations, and student data collection restrictions. The proposed changes aim to address the current challenges and evolve the regulatory framework to effectively protect children’s privacy in the digital age.

Upcoming 60-day public comment period

The proposed changes will now undergo a 60-day public comment period, providing an opportunity for individuals and organizations to express their views and offer feedback on the proposed regulations. The FTC will consider the comments received during this period before finalizing the changes. The public comment period ensures transparency and inclusivity in the rulemaking process and allows for a comprehensive evaluation of the proposed changes.

Industry Reactions

Mixed response from industry trade groups

Industry trade groups have expressed a mixed response to the proposed changes. The Software and Information Industry Association, which includes members like Amazon, Apple, Google, and Meta, expressed gratitude for the FTC’s efforts to consider outside input. The association recognized that the proposed changes cited their recommendations and expressed interest in participating in the next phase of the process. On the other hand, NetChoice, representing companies like TikTok, Snap, Amazon, Google, and Meta, raised concerns about the proposed defaults and the potential impact on parental wishes. The diverse response from industry trade groups reflects the complex nature of balancing privacy protection with industry interests.

Gratitude for considering outside input

Industry trade groups, such as the Software and Information Industry Association, emphasized their gratitude for the FTC’s efforts to consider outside input during the review process. The inclusion of recommendations from industry stakeholders indicates a collaborative approach in shaping the proposed changes. By considering a wide range of perspectives, the FTC acknowledges the importance of industry expertise and fosters a constructive dialogue between regulators and online service providers.

Concerns about defaults that parents might not want

NetChoice, representing several tech giants, expressed concerns about the proposed changes setting defaults that parents might not want. The group suggested that the new rule overrides the wishes of parents by limiting access to necessary services. This concern reflects the delicate balance between parental control and ensuring adequate privacy protections for children. Striking the right balance is crucial to address privacy concerns while maintaining necessary online services for children.

Conclusion

Finalization of the proposed changes

The proposed changes to bolster children’s online privacy safeguards mark a significant step by U.S. regulators to strengthen consumer privacy, particularly for children. The FTC’s proposal outlines comprehensive measures to restrict the collection and use of children’s personal data, disable targeted advertising by default, strengthen security requirements, and limit data retention. The proposed changes aim to create a safer online environment for children and ensure responsible data practices by online service providers.

Implementation and compliance by online services

Once the proposed changes are finalized, online service providers will be required to comply with the new regulations. They will need to ensure their platforms adhere to the updated requirements, including disabling targeted advertising by default, implementing robust security measures, and obtaining consent for student data collection. Compliance with the new regulations will be crucial to protect children’s privacy and maintain trust in online services. Ongoing monitoring and enforcement by regulators will play a vital role in ensuring effective implementation and compliance with the strengthened online privacy safeguards.
