In brief

The UAE has taken a significant step toward safeguarding children online with the introduction of Federal Decree-Law No. 26/2025 On Child Digital Safety ("CDS Law"), which came into force on 1 January 2026. In-scope entities have a one-year grace period to achieve compliance before the CDS Law becomes fully enforceable in January 2027. The CDS Law establishes a broad, high-level framework for protecting children in the digital environment, signaling a new era of accountability for businesses offering digital services in the UAE. In particular, it aims to protect children from harmful digital content, strengthen privacy and data protection, standardize age verification, and establish governance mechanisms for proactive monitoring.

Key takeaways

While the CDS Law currently sets out foundational principles, detailed compliance requirements will follow in implementing regulations. In the absence of such further details, this article sets out our key takeaways on the framework established by the CDS Law.

Scope

The CDS Law applies to:

  • Platforms: Any digital platform or entity operating within the UAE, and those operating outside the UAE if their services are directed at UAE users. The CDS Law does not define “directed”, but we expect that this will be further clarified in the implementing regulations. “Platforms” is broadly defined and includes social media platforms, streaming services, e-commerce platforms, gaming platforms, smart applications, search engines and websites.
  • Internet service providers (ISPs): ISPs licensed under the UAE Telecom Law – the only providers currently licensed are du and Etisalat (e&).
  • Custodians of children: Certain obligations apply to parents/guardians of children (i.e., those under the age of 18), though it remains to be seen how such obligations will be enforced in practice.


Risk classification

A risk classification system for digital platforms will be issued, setting out criteria for classifying platforms according to their type, content, scale of use and impact; detailed obligations and safeguards will apply according to that classification. The system will also impose certain controls and restrictions depending on the age groups accessing the platform. Until it is issued, businesses should anticipate stricter requirements for platforms with high child engagement.

Key obligations for platforms

While the CDS Law currently sets out principles rather than exhaustive rules, there are a range of obligations on platforms that we expect to be clarified with further detail in the implementing regulations:

  • Gambling and commercial gaming: Platforms are prohibited from allowing children to access online commercial games (including via advertising and promotion). Both digital platforms and ISPs must adopt the necessary technical measures to ensure compliance with this prohibition.
  • Child data privacy and protection: Platforms must not process the personal data of children under the age of 13 unless explicit custodian consent is obtained, data usage is explained to the child and custodian via a privacy policy, access is strictly limited, and data is not used for targeted advertising or commercial profiling. Platforms used for educational or health purposes are exempt from this prohibition where Cabinet approval is obtained.
  • Age verification: Platforms must implement certain age verification mechanisms. The nature of the mechanism(s) to be implemented will depend on the platform’s risk classification, and the impact of the platform’s content on children.
  • Content moderation and advertising: Platforms must implement blocking and filtering tools, content classification measures and controls on targeted advertising.
  • Custodian controls: Platforms must provide custodians with the means to set daily time limits for children’s use of the platform, manage accounts and monitor usage.
  • Notice and takedown: Platforms must implement user-friendly notice and takedown mechanisms to allow users to report child sexual abuse material (CSAM) and other harmful content.
  • Reporting: Platforms must disclose their policies around user engagement and content, and provide periodic reports to the competent authorities regarding measures taken to protect children from harmful digital content, and compliance with obligations under the CDS Law. Platforms must also report any CSAM or harmful content to the competent authorities, as well as information regarding the people/platforms involved.


Key obligations for ISPs

  • Content moderation: ISPs must implement content filtering systems to align with policies prohibiting harmful content.
  • Custodian controls: ISPs must also adopt measures to ensure safe use of services, including requiring the custodian to sign the terms of service. ISPs must provide tools that allow custodians to monitor and supervise the digital content accessible by children.
  • Reporting: ISPs must also report any CSAM or harmful content to the competent authorities, as well as information regarding the people/platforms involved.


Key obligations for custodians

As mentioned above, custodians are subject to a number of obligations under the CDS Law, including obligations to:

  • monitor their children’s digital activities and use parental control tools;
  • refrain from allowing children to access content that is inappropriate for their age;
  • refrain from exposing or exploiting children in ways that threaten their safety, privacy and dignity; and
  • report any CSAM or harmful content to which their children are exposed to the competent authorities.

We expect the implementing regulations will provide further clarity on the nature of such obligations in due course.

Governance and enforcement

The Telecommunications and Digital Government Regulatory Authority (TDRA) will oversee compliance and enforcement of the CDS Law with the support of a Child Digital Safety Council, chaired by the Minister of Family, tasked with coordinating national efforts on child digital safety. Platforms are expected to cooperate with the TDRA and the other competent authorities, participate in policy development, and support awareness initiatives.

Non-compliance may result in blocking, closure, or other administrative sanctions, with detailed penalties to be outlined in further legislation (it is envisaged that a separate regulation on administrative sanctions will be issued).

Next steps

Businesses should begin familiarizing themselves with the new framework and considering how child-safety measures may need to be integrated into existing governance and compliance programs. However, the forthcoming platform classification system will be critical, as this will ultimately determine the extent and proportionality of obligations for different services. Companies should therefore closely monitor the issuance of the implementing regulations (including the upcoming classification system), which are expected to provide more concrete guidance on compliance expectations.

To speak to us or for any assistance in relation to data and technology-related matters, please feel free to contact one of the Baker McKenzie team members listed above.
