Over the past few years, regulators around the world have stepped up enforcement of privacy laws that protect minors online. Recently, fines and injunctions have been levied against video game companies, education platforms, social media networks, smart speaker manufacturers and other digital service providers for unlawful practices involving young people's personal data.

Legislators are active in this space as well. Lawmakers have taken a hard look at online features and content that enable, facilitate or promote excessive information-sharing, bullying, self-harm, eating disorders, sexual or substance abuse, addictive behaviors and other potential harms. These outcomes are just some of the reasons governments are more strictly regulating online service providers' interactions with minors. While many legislators agree more online privacy and safety laws are needed to protect minors, jurisdictions are taking different approaches to this shared objective.

Some, such as the U.K. and California, have adopted age-appropriate design rules. These establish general principles that apply to virtually any business offering online services likely to be accessed by minors. The principles require data protection impact assessments, risk-appropriate age assurance measures (to estimate the ages of users), settings and digital experiences tailored to a user's estimated age, features designed with minors' best interests in mind, and clear limits on privacy-intrusive practices (such as targeted advertising, profiling and location-tracking). In the U.K. and California, these rules include a role for parents and guardians (collectively, "parents"). But, generally speaking, the onus falls upon companies to design their services in a way that proactively protects minors' privacy and safety.

Other U.S. laws emphasize the responsibility of parents to act as gatekeepers to their child's internet access. The 1998 U.S. Children's Online Privacy Protection Act prohibits covered online service providers from processing certain minors' information without parental consent. New laws in Arkansas, Louisiana, Texas and Utah essentially ban social media services from letting minors use their features without parental consent. Connecticut also recently amended its omnibus consumer privacy statute to create new rights and obligations largely consistent with California's Age-Appropriate Design Code Act, except that Connecticut's statute gives parents a primary role in their children's online experiences. Some recent laws focus on specific harms related to minors' online activities: Utah established a private right of action against social media platforms that are addictive to minors, and Florida enacted social media restrictions against profiling minors or using their personal data in ways that could result in substantial harms or privacy risks.

Privacy and safety laws that protect children and teens have appeared against the backdrop of omnibus data privacy and protection laws. Those laws impose general rules on processing personal data while establishing a handful of specific duties related to children's data. The EU General Data Protection Regulation, for example, includes a role for parental consent where the data subject is a child. The California Consumer Privacy Act imposes special consent requirements on covered businesses that sell or share children's personal data. Many of the growing number of state privacy laws in the U.S. include data from a known child in their definitions of "sensitive information." In those cases, specific consent, impact assessment and disclosure requirements come into play.

Under relevant laws, the age threshold for the definition of a child varies. Under COPPA, minors under 13 are protected as children. Under the CCPA, businesses must obtain parental consent to sell the personal data of children under 13, and must obtain the express consent of consumers ages 13 to 15 before selling theirs. Under some of the more recent laws and draft bills, however, anyone under 18 qualifies as a child. Wherever the higher age thresholds apply, businesses will likely find it difficult to conclude their sites are not directed to children, because many teenagers are interested in the same subjects and content as adults. Most companies will also not want to categorically prohibit all minors under 18 from accessing their applications and sites, even if they are prepared to keep out young children.

Any company that offers online services may find itself in possession of minors' personal data. Companies operating online should therefore consider the following general recommendations, especially in light of the growing body of youth online privacy and safety laws.

1. Conduct youth impact assessments.

There is no one-size-fits-all solution to the responsible design and development of online experiences. Companies should undertake individualized impact assessments specific to their unique circumstances to gather facts about the types of personal data they collect about minors, how it is used, to whom it is disclosed, and the purposes of such use and disclosure. They should evaluate the applicable legal requirements based on how they engage with minors and those minors' experience with their services, identify the harms minors may encounter through their services, and develop a plan to implement safeguards that protect minors from those harms. If attorneys direct the assessments for purposes of providing legal advice, companies may be able to shield drafts and communications relating to risk assessments with attorney-client privilege.

2. Implement age estimation, verification or assurance mechanisms.

To comply with laws such as COPPA and new restrictions under certain state laws, companies must put age-verification mechanisms in place to ensure they do not collect information from minors under a certain age without parental consent. The age-appropriate design rules in the U.K. and California take a less prescriptive approach but nevertheless require companies to estimate the age of users to a reasonable degree of certainty.

Companies often find it difficult to determine users' ages with certainty or to confirm the identity of parents. In the past, authorities accepted reliance on self-attestations where companies asked for birthday information in a neutral manner, for example asking "what is your month and year of birth?" as opposed to "confirm that you are 18 years or older."
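
By way of illustration, the sketch below shows one way a neutral age gate might work in practice; the function names and the age thresholds of 13 and 18 are assumptions that would need to be adapted to the laws that actually apply.

```python
from datetime import date

COPPA_AGE = 13   # illustrative thresholds; adjust to the laws that actually apply
ADULT_AGE = 18

def estimated_age(birth_year: int, birth_month: int) -> int:
    """Conservatively estimate age from a neutrally phrased month/year question."""
    today = date.today()
    age = today.year - birth_year
    if today.month <= birth_month:
        # Without a day of birth, assume the birthday has not yet passed this year.
        age -= 1
    return age

def route_user(birth_year: int, birth_month: int) -> str:
    """Route the user by self-attested age without hinting at a 'correct' answer."""
    age = estimated_age(birth_year, birth_month)
    if age < COPPA_AGE:
        return "require_verifiable_parental_consent"
    if age < ADULT_AGE:
        return "teen_experience_with_high_privacy_defaults"
    return "adult_experience"

# Example: a user who states they were born in June 2012.
print(route_user(2012, 6))
```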

In a development that more closely reflects how youths actually use the internet, many newer laws require additional efforts to ascertain the age of users. Companies that offer high-risk services to minors should consider age-gating backed up by government ID checks.

Other means of age assurance might include verification through additional data points (such as payments systems, email or phone number communications), self-attestations under penalty of perjury, or age estimation using biometric scans and database checks. Implementing age assurance mechanisms may trigger additional obligations and risks under privacy laws.

There is no industry standard age assurance mechanism yet. Some bodies have developed standards in the space, such as the Institute of Electrical and Electronics Engineers' P2089.1 standard.

3. Set "high privacy" by default.

Wherever possible, settings for minors should be "high privacy" by default. Companies should be particularly mindful of legal restrictions around the use of minors' personal data for delivering targeted or behavioral advertising, profiling to infer characteristics about them, tracking their precise geolocation, or streaming their real-time video footage or verbal statements.
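
As a purely illustrative sketch, the following shows how "high privacy" defaults for minors might be expressed in code; the setting names and the under-18 cutoff are assumptions, not requirements of any particular law.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Every intrusive feature defaults to "off".
    targeted_advertising: bool = False
    profiling: bool = False
    precise_geolocation: bool = False
    public_profile: bool = False
    live_streaming: bool = False

def default_settings(estimated_age: int) -> PrivacySettings:
    """Start every minor at the most protective settings; changes require consent."""
    if estimated_age < 18:
        return PrivacySettings()  # "high privacy" by default
    # Adults may start from a more permissive baseline, subject to applicable law.
    return PrivacySettings(public_profile=True)

print(default_settings(15))
```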

Companies should also obtain a minor's opt-in consent, or the consent of their parent where appropriate, before allowing them to make their personal data available to others on a service.

Companies should also consider designing experiences with different functionalities for different age groups to mitigate minor-specific risks such as excessive purchases, unwanted contact, unwanted dissemination of personal information, and inappropriate content.

4. Make legal language clear, concise and comprehensible.

Companies should regularly review and update their privacy notices and legal terms to ensure they align with current laws and regulations around youth online privacy. Relevant materials should be clear, concise, and written in language that young people can understand.

Companies should consider surfacing just-in-time privacy notices, such as statements that explain exactly what a feature does when the user interacts with it. Legal terms intended to protect young users from inappropriate situations should be clear, and companies should put mechanisms in place to ensure these terms are enforced.

5. Adopt data minimization practices.

Companies should only collect from minors the minimum amount of personal data necessary to provide the core functionality of their services to young users.

They should ask only for necessary information during account creation. For instance, a gaming app may only need a username and password, rather than full name, address or school details.
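
The gaming app example might translate into something like the following sketch, in which anything beyond an assumed minimal set of fields is rejected or discarded before storage; the field names are hypothetical.

```python
# Illustrative only: a gaming app that needs nothing beyond credentials and a neutral age question.
REQUIRED_SIGNUP_FIELDS = {"username", "password", "birth_month", "birth_year"}

def minimized_signup(submitted: dict) -> dict:
    """Reject incomplete submissions and drop any extra fields before storage."""
    missing = REQUIRED_SIGNUP_FIELDS - submitted.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    # Anything beyond the required fields (full name, address, school) is discarded.
    return {k: v for k, v in submitted.items() if k in REQUIRED_SIGNUP_FIELDS}

print(minimized_signup({
    "username": "player1",
    "password": "example-password",
    "birth_month": 6,
    "birth_year": 2012,
    "school": "Anytown Elementary",  # sent by the client, never stored
}))
```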

Companies should not collect or store location data from a minor's device unless it is strictly necessary for their services. For example, an educational app would not typically need to know a child's exact location to function effectively.

Use of cookies or other tracking technologies that collect unnecessary personal data should be limited, particularly because minors may not fully understand how these technologies work.

Companies should regularly review data storage and delete any unnecessary data. This means minimizing not only the personal data a company collects but also the data it retains.

Deidentification and aggregation techniques should also be applied to minors' personal data before using it for ancillary purposes, including research and product development.
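
For example, the following sketch aggregates usage data into age bands and suppresses small groups before the data is reused for research; the field names, age bands and group-size threshold are illustrative assumptions, and real deidentification also calls for a re-identification risk review.

```python
def aggregate_by_age_band(sessions: list[dict], min_group_size: int = 2) -> dict:
    """Strip identifiers and report only per-age-band averages for sufficiently large groups."""
    bands: dict[str, list[int]] = {}
    for s in sessions:
        band = "under_13" if s["age"] < 13 else "13_to_17"
        bands.setdefault(band, []).append(s["minutes_played"])
    return {
        band: sum(values) / len(values)
        for band, values in bands.items()
        if len(values) >= min_group_size  # suppress small groups to reduce re-identification risk
    }

raw_sessions = [
    {"user_id": "u1", "age": 12, "minutes_played": 34},
    {"user_id": "u2", "age": 12, "minutes_played": 20},
    {"user_id": "u3", "age": 15, "minutes_played": 55},
]
print(aggregate_by_age_band(raw_sessions))  # only the under_13 band is large enough to report
```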

6. Use algorithms and artificial intelligence responsibly.

When deploying algorithms and AI technologies, especially those that interact with or collect data from minors, companies must ensure they follow responsible practices. Many of these practices will also apply to interactions with adults, but companies should be particularly mindful that minors, generally, have less experience and awareness than adults. Consequently, young people are often more prone to manipulation and misunderstanding. For example, companies should avoid using algorithms specifically to make their services more addictive to minors.

In addition, companies that use generative AI programs or use algorithms to recommend, promote, rank or filter content should ensure outputs are appropriate to the age of the user. Texas' online youth protection law, HB 18, for example, imposes strict requirements on companies that use algorithms, including a requirement to make copies of the algorithm code available to independent security researchers (except where the code constitutes a trade secret). California's age-appropriate design law specifically requires covered businesses to assess whether their algorithms could cause harm to minors.
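
As a simplified illustration of age-appropriate output, a recommendation pipeline might gate items by a content rating before display; the rating labels and thresholds below are hypothetical, and a real service would rely on an established rating scheme.

```python
# Hypothetical rating labels mapped to minimum ages.
RATING_MIN_AGE = {"everyone": 0, "teen": 13, "mature": 18}

def age_appropriate(recommendations: list[dict], user_age: int) -> list[dict]:
    """Filter algorithmically recommended items to those rated at or below the user's age."""
    return [
        item for item in recommendations
        if RATING_MIN_AGE.get(item["rating"], 18) <= user_age  # unknown ratings treated as adult-only
    ]

candidates = [
    {"id": 1, "rating": "everyone"},
    {"id": 2, "rating": "mature"},
    {"id": 3, "rating": "teen"},
]
print(age_appropriate(candidates, user_age=15))  # returns items 1 and 3
```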

7. Provide parental controls.

Many jurisdictions encourage or require the provision of parental controls for services aimed at children. Such controls let parents limit the data collected from their children. These tools empower parents to manage and monitor their child's use of the service, which can be a helpful way to mitigate risk and promote parental participation in a minor's use of online services.

Under California law, however, businesses must provide an obvious signal to children when they are being monitored or tracked, if the online service, product or feature allows parents to access their children's activities or locations.

Parental controls might include options to limit the amount of time a minor can spend on the service, restrict access to certain features (such as in-app purchases or social interactions), or review and approve any data that might be shared publicly, such as posts on a social network. Parental consent mechanisms should be robust and verifiable, so the platform can confirm it is indeed the parent giving consent rather than the underage user.
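
A minimal sketch of such controls, including an obvious on-screen signal to the child when monitoring is enabled (as California's design code requires), might look like the following; the specific settings and default values are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ParentalControls:
    daily_time_limit_minutes: int = 60
    in_app_purchases_allowed: bool = False
    social_features_allowed: bool = False
    public_posts_require_approval: bool = True
    monitoring_enabled: bool = False

def child_ui_banner(controls: ParentalControls) -> Optional[str]:
    """Give the child an obvious signal whenever a parent can see their activity."""
    if controls.monitoring_enabled:
        return "A parent or guardian can view your activity on this account."
    return None

controls = ParentalControls(monitoring_enabled=True)
print(child_ui_banner(controls))
```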

8. Implement robust security measures.

Given the general sensitivity of minors' data, protecting it from breaches is a crucial pillar of COPPA and other laws. Companies should ensure they have reasonable and appropriate data security measures in place to prevent unauthorized or accidental disclosures or other processing.

Appropriate measures may include encrypting data at rest and in transit, implementing secure access controls, conducting regular security audits and vulnerability assessments, and having incident-response plans in place for suspected breaches. Regular updates and patches should also be applied to all systems to guard against known security vulnerabilities.
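
As one narrow illustration of encryption at rest, the sketch below encrypts a record before it is written to storage using the widely used cryptography package; key management, transport encryption and access controls are equally important but omitted here.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# In production, load the key from a key management service; never hard-code or log it.
key = Fernet.generate_key()
fernet = Fernet(key)

def store_minor_record(plaintext: str) -> bytes:
    """Encrypt a minor's record before it is written to storage (encryption at rest)."""
    return fernet.encrypt(plaintext.encode("utf-8"))

def read_minor_record(ciphertext: bytes) -> str:
    """Decrypt a record for an authorized, access-controlled read."""
    return fernet.decrypt(ciphertext).decode("utf-8")

token = store_minor_record("username=player1; birth_year=2012")
print(read_minor_record(token))
```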

Furthermore, companies should adopt a "security by design" approach, meaning security considerations are integrated into the product development process from the outset, rather than as an afterthought. Strong security measures are critical to maintaining user trust, protecting minors' privacy and meeting regulatory requirements.

Moving forward

Global regulations governing the delivery of online services to minors are rapidly evolving, and companies should monitor legal developments closely. Misusing minors' personal data and exposing them to harmful outcomes can lead not only to regulatory enforcement but also significant reputational harm. Following these eight recommendations can help companies address some of the most significant legal risks, child safety threats and well-being concerns associated with providing online services to minors.


This article was first published by the IAPP.
