The Information Commissioner’s Office (ICO) has published its Age-Appropriate Design Code of Practice for Online Services (Code), following public consultation last year.

Who should read this?

  • Anyone who designs or develops online services will need to consider whether the Code applies. Relevant services include apps, connected toys and devices (including any device likely to be used by a child), search engines, social media platforms, streaming services, online games, news or educational websites, and websites that offer goods or services over the internet.
  • The Code does not just apply to services specifically directed at children. If your service is likely to be used by anyone under the age of 18, the Code will apply.

What is the Code about?

  • The ICO has stated that use of children’s data is one of its regulatory priorities and that complying with the standards set out in the Code is a key measure for demonstrating compliance with data protection laws.
  • At the heart of the Code is the principle that the best interests of the child should be the primary consideration when designing and developing apps, games, connected toys/devices and websites that are likely to be accessed by children.
  • The Code applies to personal data collected directly from children as well as “inferred data” about a child. The Code applies to both new and existing services, so providers of existing services must start preparing now in order to be ready to comply with the Code when it enters into force.
  • The Code is intended to instigate significant change not just in the UK, as the Information Commissioner notes in her foreword that the Code is “the first of its kind” and intended to “lead to changes in practices that other countries are considering too”.
  • We have summarised below the status of the Code and when organisations need to comply, the types of services that the Code applies to, the practical steps organisations will need to take, the standards set out in the Code, and the consequences of non-compliance.

What is the status of the Code and when do organisations need to comply?

  • The Code is a statutory code of practice required to be prepared by the ICO under the Data Protection Act 2018 (DPA 2018). The ICO is required under the DPA 2018 to take the Code into account when considering if an online service has complied with its obligations under the GDPR and Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR).
  • Once the Code has been approved by Parliament, organisations will have 12 months to update their practices before the Code comes into full effect. The ICO expects this to be by autumn 2021. However, organisations will need to use that time to review their current practices and make the necessary changes.

What types of services does the Code apply to?

  • The Code applies to “information society services” that are “likely” to be accessed by children. A child for the purposes of the Code is anyone under the age of 18, which follows the approach in the United Nations Convention on the Rights of the Child.
  • If a service is designed for and aimed at children under the age of 18 the Code will apply. However, the Code is much wider and applies to services that are not specifically aimed or targeted at children, but are “likely” to be used by children.
  • The ICO’s view is that a service is likely to be used by a child if it is more probable than not that children will access or use the service. However, this does not mean all services that children could possibly access are covered. The factors taken into account include the nature and content of the service, whether it has particular appeal for children, the way the service is accessed and any measures that are in place to prevent children from accessing the service.
  • The Code applies if the online service is based in the UK, and to organisations based outside of the EEA if they offer services to users in the UK or monitor their behaviour in the UK.

What practical steps should organisations take?

  • As a starting point, it will be necessary to assess whether the Code applies to your existing services. If you decide the Code does not apply, the ICO expects organisations to document the reasons for that decision.
  • If the Code does apply, the ICO expects organisations to be prepared to demonstrate how they comply with the Code.
  • It will therefore be important to document how you have complied in practice with the requirements of the Code, and be able to provide the ICO with copies of relevant DPIAs, policies, training and records of processing if requested.
  • Some practical steps to prepare include:
    • Updating DPIA templates to include elements demonstrating how the requirements of the Code have been met, as well as conducting/updating DPIAs on existing services and consulting with children/parents where necessary;
    • Reviewing existing/introducing new age verification mechanisms to your service where necessary;
    • Reviewing/creating new information and resources for child users appropriate for their age;
    • Ensuring age-appropriate tools are in place for children to exercise their rights under data protection laws;
    • Reviewing existing services and ensuring design changes are made where necessary in light of the Code, including default privacy settings, profiling, nudge techniques and just-in-time notices.

What are the standards?

The Code includes 15 standards of age-appropriate design, which are summarised below.

1. Best interests of the child

  • The best interests of the child is the primary consideration when designing/developing an online service likely to be accessed by a child.
  • The principle of the best interests of the child is something that needs to be considered when designing online services and is a recurring theme throughout the Code.
  • It requires considering the needs of child users, and determining how those needs can be best supported through the way the service is designed to process their personal data.
  • Organisations can still pursue their own commercial or other interests, although where there is a conflict the ICO’s view is that it is unlikely the commercial interests of an organisation will outweigh a child’s right to privacy.

2. Data Protection Impact Assessments (DPIA)

  • If an organisation is offering an online service likely to be accessed by children, a DPIA is likely to be required. The risks to children which arise from the data processing should be assessed and mitigated.
  • Although a DPIA is likely to be required under the GDPR, organisations must assess and document their compliance with the Code and additional elements should be built into each stage of the DPIA, not bolted on at the end. The DPIA should include an explanation of how each of the standards of the Code have been complied with.
  • The ICO expects larger organisations to conduct some form of consultation with children and parents in most cases. If the organisation considers it is not possible to consult or it is unnecessary or wholly disproportionate, that decision should be recorded in the DPIA. However, the ICO’s view is that it is usually possible to conduct some form of market research or user feedback.

3. Age-appropriate application

  • A risk based approach should be taken to recognise the age of the user and apply the standards of the Code effectively to children.
  • The age range of the user should be established with a level of certainty appropriate to the risks to the child that arise from the data processing, so the protections and safeguards are tailored to the age of the child. If this isn’t possible or an organisation does not want to do this, the Code states you should:
    • (a) reduce the risks to personal data inherent in the service;
    • (b) put in place additional measures to increase the level of confidence in the age of the user; or
    • (c) apply the standards of the Code to all users, regardless of whether they have self-declared as an adult or child.
  • The Code does not mandate particular age verification methods, but sets out a non-exhaustive list of methods to consider, which include:
    • (a) self-declaration (stating age but no evidence to confirm), which might be appropriate for low risk processing or in combination with other techniques;
    • (b) artificial intelligence (estimating age of the user using AI to analyse how the user interacts with the service);
    • (c) third party age verification services, although it is important to carry out due diligence to be comfortable with the level of certainty the service provides regarding the confirmed age of the user, and that it complies with data protection laws;
    • (d) account holder confirmation (e.g. for logged in or subscription based services);
    • (e) technical measures to strengthen self-declaration of age (e.g. preventing users from immediately submitting a new age if they are denied access after first self-declaring); and
    • (f) hard identifiers / formal identity documents such as a passport. Generally this should not be the only option unless the risks in the processing warrant such an approach.
  • The Code states that personal data obtained to verify age cannot be used for other purposes, such as targeting children with advertising or sending details of “birthday offers”; consent would be needed for this type of profiling.
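One of the technical measures listed above, (e), can be illustrated in code. The sketch below is a hypothetical example, not taken from the Code: it blocks a user who has been denied access from immediately resubmitting a different age. The class and method names, the minimum age of 13 and the 24-hour lockout period are all illustrative assumptions.

```python
# Hypothetical sketch of technical measure (e): strengthening self-declared
# age by blocking an immediate retry with a different age after a denial.
# The minimum age and lockout period are illustrative assumptions only.
import time

MINIMUM_AGE = 13                 # assumed threshold for this example service
LOCKOUT_SECONDS = 24 * 60 * 60   # assumed lockout after a denied declaration

class AgeGate:
    def __init__(self):
        self._denied_until = {}  # session/device id -> lockout expiry time

    def submit_age(self, session_id: str, declared_age: int, now: float = None) -> bool:
        now = time.time() if now is None else now
        # Refuse new declarations while a lockout from a prior denial is active.
        if self._denied_until.get(session_id, 0) > now:
            return False
        if declared_age < MINIMUM_AGE:
            self._denied_until[session_id] = now + LOCKOUT_SECONDS
            return False
        return True
```

In this sketch, a session that is denied access cannot simply re-enter an older age moments later, which is the behaviour the Code describes.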

4. Transparency

  • The privacy information provided to users should be prominent, concise and in clear language that is appropriate to the age of the child.
  • Children should be provided with “bite-sized” explanations at the point at which use of their personal data is activated.
  • Child-friendly explanations should be provided alongside terms, policies and community standards.
  • Information should be presented in a child friendly way which is likely to appeal to the age of the child accessing the service, which may include diagrams, cartoons, graphics, video/audio, gamified or interactive content.
  • The Code also mentions use of privacy dashboards, layered information, icons and symbols to present information in a child friendly way.
  • Information should be tailored to the age of the child. The age ranges set out in the Code are: 0-5 years (pre-literate and early literacy), 6 to 9 years (core primary school years), 10 to 12 (transition years), 13 to 15 (early teens) and 16 to 17 (approaching adulthood), and examples are provided for each age range.
  • The Code mentions that if user testing is not conducted, organisations should document the reason why in the DPIA.
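The Code’s age ranges lend themselves to a simple lookup when tailoring privacy information to the user. The band labels below come from the Code; the function name and the treatment of users aged 18 and over are our own illustrative assumptions.

```python
# The Code's age bands (from the transparency standard), expressed as a
# lookup so privacy information can be tailored to the user's age.
# Band labels are from the Code; the function name is an assumption.
AGE_BANDS = [
    (5,  "0-5: pre-literate and early literacy"),
    (9,  "6-9: core primary school years"),
    (12, "10-12: transition years"),
    (15, "13-15: early teens"),
    (17, "16-17: approaching adulthood"),
]

def age_band(age: int) -> str:
    """Return the Code's age band for a child, or 'adult' for 18+."""
    for upper, label in AGE_BANDS:
        if age <= upper:
            return label
    return "adult"
```

A service could then select, for example, cartoon-based explanations for the youngest bands and fuller written notices for older teens, consistent with the examples the Code provides for each range.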

5. Detrimental use of data

  • Children’s personal data should not be used in a way that has been shown to be detrimental to their well-being, or goes against industry codes of practice, Government advice or other regulatory provisions.
  • The Code requires a precautionary approach: children’s personal data should also not be processed in ways that have been formally identified as requiring further research or evidence to determine whether they are detrimental to the health and well-being of children.
  • As an example, in light of precautionary advice from the UK Chief Medical Officers, features that use personal data to make it difficult for children to disengage with a service are unlikely to comply with the fairness principle under the GDPR (e.g. features that incentivise children to stay engaged such as in-game advantages or automatically extending use instead of requiring an active choice to continue).

6. Policies and community standards

  • Organisations are expected to uphold their own published terms, policies and community standards. This includes privacy policies, age restriction, content policies and behaviour rules/community guidelines. In summary, this means saying what you do, and doing what you say.
  • For example, if you say that you actively monitor user behaviour then you need to do so. In addition, if you rely on “back end” processes such as user reporting to identify breaches of user policies and standards, this should be made very clear. It is unlikely that adopting a light touch or back end only process to monitor and enforce user policies and standards would be sufficient if there are high risks to children.

7. Default settings

  • The default settings should be “high privacy”, unless there is a compelling reason which can be demonstrated for a different default setting, taking into account the best interests of the child.
  • Unless the default settings are changed, use of children’s personal data should be limited to what is essential to provide the service.
  • Privacy settings should be provided for any processing of children’s personal data for additional or optional elements of the service that go beyond the core service (e.g. personalisation). The ICO has stated it will look very carefully at claims that a privacy setting cannot be provided to children because the data is needed to provide the core service.
  • Age-appropriate prompts should be provided at points where a child attempts to change a privacy setting, in order to mitigate risk. Children should not be “nudged” towards selecting a lower privacy setting.
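The “high privacy by default” standard can be sketched as follows. This is an illustrative example under our own assumptions: the field names are invented, and the point is simply that every optional element beyond the core service defaults to off and has its own setting rather than being bundled.

```python
# Illustrative sketch of "high privacy by default": optional elements beyond
# the core service default to off, each with its own separate setting.
# All field names are assumptions, not taken from the Code.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # The core service needs no setting; everything optional starts off.
    personalisation: bool = False
    behavioural_advertising: bool = False
    location_visible_to_others: bool = False

    def enabled_options(self):
        # List only the optional elements the child has actively turned on.
        return [name for name, on in vars(self).items() if on]
```

A newly created `PrivacySettings()` enables nothing, so personal data use is limited to what is essential to provide the service unless the child actively changes a setting.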

8. Data minimisation

  • Only the minimum amount of personal data should be collected and retained in order to provide the element(s) of the service that the child is actively and knowingly engaged in. It will be necessary to differentiate between each element of the service and consider what personal data is needed and how long it is needed for.
  • Children should be given separate choices over which elements they wish to activate. The Code states that this is particularly important for processing personal data to enhance, improve or personalise the user experience beyond providing the core service. Processing children’s personal data for providing the core service and improvements/personalisation should not be bundled together.

9. Data sharing

  • Children’s personal data (including inferred or derived data) should not be disclosed unless there is a compelling reason that can be demonstrated for doing so, taking into account the best interests of the child. For example, it is unlikely that selling children’s personal data for commercial re-use would be a compelling reason.
  • Due diligence regarding data protection practices should be conducted on a potential recipient of children’s personal data before sharing such data, to avoid sharing personal data with a third party that may use the personal data in ways that have been shown to be detrimental to the well-being of children.

10. Geolocation

  • Options for collecting geolocation data should be turned off by default, unless there is a compelling reason which can be demonstrated for geolocation to be switched on by default, taking into account the best interests of the child. Any geolocation services that are additional to the core service should be subject to a separate privacy setting.
  • When location tracking is active, an obvious sign should be provided to the child. Options that make a child’s location visible to others must revert to “off” by default at the end of each session.
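The two behaviours above can be sketched in code. This is a minimal illustration under our own assumptions (the class, method names and indicator text are invented): an obvious sign while tracking is active, and visibility to others reverting to off at the end of each session.

```python
# Sketch of the geolocation standard: an obvious sign while tracking is
# active, and "visible to others" reverting to off at session end.
# Class, method and indicator names are illustrative assumptions.
class GeolocationState:
    def __init__(self):
        self.tracking_active = False
        self.visible_to_others = False

    def tracking_indicator(self) -> str:
        # An obvious sign must be shown to the child while tracking is on.
        return "Location sharing is ON" if self.tracking_active else ""

    def end_session(self):
        # Options making the child's location visible to others must
        # revert to off by default at the end of each session.
        self.visible_to_others = False
```

Even if a child switches visibility on during a session, the next session starts with it off again, matching the default-off requirement.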

11. Parental controls

  • If parental controls are provided, the child must be given age-appropriate information about this. The information should be tailored to the age range of the child, and could include audio/video materials to provide this information to children, as well as resources for parents to explain the service and privacy issues to their children.
  • An obvious sign must be provided to children when they are being monitored if the service allows the parent or carer to monitor the child’s behaviour online or track their location.

12. Profiling

  • Options which use profiling should be switched “off” by default, unless there is a compelling reason which can be demonstrated for profiling to be on by default, taking into account the best interests of the child. If profiling is essential to the core service, a privacy setting is not required, although this is interpreted narrowly as the profiling must be “completely intrinsic to the service”. Therefore, most profiling should be subject to a privacy setting.
  • Privacy settings should always be provided for behavioural advertising used to fund the service, where it is not part of the core service the child wants to access.
  • Separate privacy settings should be provided for each different type of profiling, and catch-all purposes such as “providing a personalised service” are in the ICO’s view not specific enough.
  • Profiling should only occur if there are appropriate measures in place to protect the child from any harmful effects (in particular content that is detrimental to health or wellbeing). For example, the Code states that profiled content feeds which take children away from their original area of interest towards content that is less suitable for children raise particular concerns.

13. Nudge techniques

  • Nudge techniques (features that encourage or lead users to follow a preferred path) should not be used to encourage children to provide unnecessary personal data or weaken or turn off their privacy protections. The Code states that you should not exploit unconscious psychological processes or use nudge techniques that might lead a child to lie about their age.
  • However, pro-active nudges that support the development needs, health and wellbeing of the child are encouraged.

14. Connected toys and devices

  • Connected toys or devices must include effective tools to enable compliance with the Code.
  • This applies to devices obviously intended for children such as connected toys, but is much wider and applies to any connected device likely to be used by multiple users of different ages including children, e.g. “home hub” interactive speakers.
  • In practice, the Code recommends ensuring that by default the service is suitable for use by children, and providing user profile options for regular users to support use by adults or tailor the service to the age of the particular child.
  • Just in time information should be provided, which could for example be auto-play audio messages, ensuring default settings can only be changed via a support app, or interactive auto bot “conversations” with the user.
  • Devices with a standby or listening mode should include a clear indication when the listening mode is active; personal data should not be collected without such an indication. Listening mode features should be easy to switch off on the device or via online options, so the toy/device can be used as a non-connected device where practicable.

15. Online tools

  • Prominent and accessible tools should be provided to help children exercise their data protection rights and report concerns.
  • The tools should be appropriate and tailored to the age of the child, and the Code provides examples such as use of icons or audio prompts.

What are the consequences of non-compliance?

  • In terms of enforcement, if the ICO finds there has been a breach of PECR or the GDPR, it can use its powers to take regulatory enforcement action. The ICO has stated that where it sees harm or potential harm to children, it is likely to take more severe action against a company than it would for breaches involving other types of personal data.
  • If the ICO decides that an organisation has not complied with the standards in the Code, the ICO is more likely to allow the organisation time to make changes to the relevant service to comply if the organisation has a well-documented and reasoned case which supports the approach the organisation has taken.
  • However, the ICO is more likely to take formal enforcement action if proper steps have not been taken to comply with the Code and there is clear evidence or constructive knowledge that children are likely to access the service, and clear evidence of a significant risk from the use of children’s data.