Federal Decree-Law No. 26 of 2025 establishes the UAE's first dedicated regime for protecting children in digital environments. The law operates on two parallel tracks — distinct obligations for digital platforms (gaming, social media, live-streaming, e-commerce) and for internet service providers. The strictest provisions target children under 13, where verifiable parental consent and a near-total prohibition on data collection apply. The law is in force from 1 January 2026, with full compliance required by 1 January 2027.
The compliance window is short. The law took effect 1 January 2026. Affected entities have one year to come into compliance — meaning operational obligations land 1 January 2027, with implementing regulations and the Cabinet platform classification expected during the grace period. Practitioners should treat the year as build time, not headroom.
The Child Digital Safety Law is structurally bifurcated. Digital platforms — gaming, social media, live-streaming, e-commerce, and similar consumer-facing digital services — carry one set of obligations focused on age verification, content moderation, advertising restrictions and data protection for children. ISPs carry a parallel but narrower set focused on network-level filtering and infrastructure-level safe-access measures. Most groups operating in or into the UAE need to map their entities to one or both tracks.
Gaming, social media, live-streaming, e-commerce, and other consumer-facing digital services. Obligations are user-facing, age-aware, and content-aware. Platform classification by Cabinet decision will determine the proportionality of each obligation to platform risk profile.
Network-level obligations on entities licensed by the TDRA to provide internet access in the UAE. The ISP track is narrower than the platform track — focused on infrastructure-level filtering and supervised-access support — but applies independently of any platform-side compliance.
A single corporate group can sit on both tracks. A telco offering broadband (ISP track) plus a content streaming service (platform track) carries dual obligations under the same Decree-Law, with separate compliance discipline for each. Many Gulf and regional telecoms groups will fall into this category by virtue of vertical integration.
The Child Digital Safety Law has explicit extra-territorial reach. It applies to digital platforms and ISPs operating in the UAE, and equally to those targeting users in the UAE — even with no UAE legal presence. Foreign platforms with material UAE user bases are squarely in scope, and the Implementing Regulations will set the basis for enforcement actions including service blocking, suspension, or closure.
Any digital platform — gaming, social media, live-streaming, e-commerce, content sharing, or similar — established in the UAE and accessible to or used by children. Carries the full digital platform obligations track. Subject to the forthcoming Cabinet platform classification system. CDS Law · Track 01 obligations.
Any ISP licensed by the TDRA to provide internet access services in the UAE. Carries the ISP obligations track — network-level filtering, supervised-access support, parental control infrastructure, regulator cooperation. CDS Law · Track 02 obligations.
Platforms with no UAE legal presence but with a material UAE user base or with content directed at UAE users. Squarely in scope. The combination of extra-territorial reach, mandatory cooperation, and service-blocking enforcement creates significant exposure even without UAE establishment. CDS Law · Extra-territorial reach.
Groups operating both ISP services and consumer-facing digital platforms — a common scenario for Gulf and regional telcos by virtue of vertical integration. Both obligation tracks apply; compliance must be designed independently per track and reconciled at group level. Both tracks · Group-level compliance.
While the law applies to all under-18s, the strictest provisions kick in at age 13. Below 13, the regime moves from protective controls to near-total restriction on platform data handling. Verifiable parental consent becomes the only viable basis for processing; commercialisation of children's data is prohibited; and the obligations apply directly to digital platforms regardless of jurisdiction. This is the most demanding provision in the entire UAE privacy framework — stricter than anything in the Federal PDPL, DIFC, or ADGM regimes.
These are four interlocking restrictions, and together they make the age-13 boundary the operational threshold every digital platform with UAE users needs to identify, verify, and manage as a hard cut-off in product flows.
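The hard cut-off described above can be sketched as a simple age-gating check that routes a verified age into one of three regimes. This is an illustrative sketch only — the thresholds reflect the law's structure, but the regime names and function are hypothetical, not anything the Decree-Law prescribes.

```python
from enum import Enum

class Regime(Enum):
    ADULT = "adult"
    PROTECTED_MINOR = "protected_minor"    # 13-17: protective controls apply
    RESTRICTED_CHILD = "restricted_child"  # under 13: near-total data restriction

def classify_user(verified_age: int) -> Regime:
    """Map a verified age to the applicable regime.

    All under-18s are in scope of the CDS Law; the strictest
    provisions (consent-only processing, no data commercialisation)
    apply below age 13.
    """
    if verified_age < 13:
        return Regime.RESTRICTED_CHILD
    if verified_age < 18:
        return Regime.PROTECTED_MINOR
    return Regime.ADULT
```

The point of treating this as a single routing function is auditability: every downstream product decision (defaults, advertising, consent basis) keys off one classification rather than scattered age checks.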
Unlike the Federal PDPL — which sits with the UAE Data Office — the Child Digital Safety Law distributes regulatory authority across multiple bodies. The TDRA holds authority over ISPs and standards-setting; competent authorities (typically media, child protection and cybersecurity bodies) supervise platforms in their respective remits; and the Child Digital Safety Council coordinates the overall framework. Practitioners should expect compliance interactions with multiple authorities, not one.
Telecommunications and Digital Government Regulatory Authority. Issues policies and standards for ISP compliance under the law. Supervises ISP-track obligations — network filtering, parental control infrastructure, supervised-access measures. The primary regulator for the ISP track.
Authorities responsible for child affairs, media, and cybersecurity. Each supervises platform obligations within its respective remit. A gaming platform might engage with the media authority; a social-media platform with the cybersecurity authority; a financial e-commerce platform with multiple authorities. Allocation of authority is content- and sector-driven.
Chaired by the Minister of Family. Advisory and coordinating body bringing together federal entities, local entities, and the private sector. Proposes strategic policies and initiatives, ensures alignment with international standards. Not a direct enforcement body — sits above the operational regulators in a coordination role.
Federal Decree-Law No. 45 of 2021 — Personal Data Protection Law. The CDS Law's data-protection obligations explicitly align with the broader Federal PDPL framework. Where children's personal data is processed, both laws apply concurrently. The under-13 protections are stricter than anything in Federal PDPL — when both apply, the stricter standard prevails.
Forthcoming Cabinet decision. Will classify digital platforms by risk profile — determining the proportionality of obligations like age verification mechanisms, content moderation depth, and parental control sophistication. Higher-risk platforms (live-streaming, gaming with chat) will carry stronger obligations than lower-risk platforms (static content, e-commerce). Classification system pending publication.
The specific administrative penalty figures are pending the Implementing Regulations expected during 2026. The law itself names the categories of enforcement available — and these are substantial. Service blocking, account suspension, and platform closure are explicitly on the table, alongside whatever financial penalty schedule the Cabinet sets. In addition, the Cybercrime Law and the PDPL operate in parallel where their respective triggers are met.
| Enforcement category | Source | Practical scope |
|---|---|---|
| Administrative fines — to be set by Implementing Regulations (CDS Law) | Forthcoming Administrative Penalties Regulation under the CDS Law. | Specific figures pending. Expected to follow the proportionality logic applied across UAE federal regulation — fines calibrated to violation severity, with separate ceilings for individual vs corporate offenders. Multiple-violation aggregation likely. |
| Service blocking / suspension — operational consequence (CDS Law) | Explicitly named in the law as available enforcement. | For foreign platforms in particular, the threat of service blocking is the most material enforcement consequence — given the absence of UAE establishment, financial penalty collection is harder, but blocking access at the network level is enforceable through ISPs. Similar to existing UAE blocking authority for prohibited content. |
| Up to AED 5 million, plus imprisonment (Cybercrime Law overlay) | Federal Decree-Law No. 34 of 2021 (Cybercrime Law). | Applies in parallel where child-data misuse meets cybercrime triggers — unauthorised access, intentional disclosure, identity-related crimes targeting minors. Imprisonment available for material offences. Operates independently of CDS Law administrative penalties; one incident can trigger both. |
| PDPL administrative penalties — pending Federal PDPL Executive Regulations (Federal PDPL) | Federal Decree-Law No. 45 of 2021 — Personal Data Protection Law. | Federal PDPL sanctions operate concurrently for personal-data violations involving children. The PDPL Executive Regulations are themselves pending — meaning headline figures are still to be set. Where both regimes apply (typical for child personal-data processing), the stricter penalty applies. |
For foreign platforms, the headline risk is service blocking — not financial penalty. The combination of extra-territorial reach, mandatory regulator cooperation, and explicit blocking authority means non-compliance can be enforced at the network level even where financial enforcement is impractical. UAE authorities have a track record of using content-blocking authority against non-compliant foreign services.
The work splits cleanly along the two-track structure. Platform operators carry workstreams W-01 to W-08; ISPs carry workstreams W-09 to W-11; vertically integrated groups carry both. Most workstreams must be largely complete by 1 January 2027 — meaning detailed scoping should already be underway in early 2026 to leave realistic build time.
First-pass determination: is the entity on the platform track, the ISP track, or both? For platform-track entities, alignment to the forthcoming Cabinet platform classification — risk-tier mapping for proportionality of obligations.
Age verification mechanism design proportionate to risk classification. For high-risk platforms (live-streaming, gaming with chat), robust verification with low circumvention risk. For lower-risk platforms, lighter mechanisms acceptable. Cross-jurisdiction integration where the platform also operates under EU GDPR / UK Children's Code obligations.
Verifiable parental consent mechanism for under-13 data processing. Identity-verification workflow for parent / guardian. Audit trail of consent capture. Refresh / re-verification protocol where consent context changes. Operational hard requirement for any under-13 processing.
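The consent workstream above — verification, audit trail, refresh protocol — can be sketched as an append-only consent record. This is a hypothetical data model under stated assumptions: the law requires verifiable consent and demonstrability, but it does not prescribe this structure, and the field and action names are illustrative.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentEvent:
    """One immutable entry in the parental-consent audit trail."""
    timestamp: datetime
    action: str               # illustrative: "granted", "refreshed", "withdrawn"
    verification_method: str  # illustrative: "id_document", "payment_card"
    guardian_ref: str         # pseudonymous reference to the verified guardian

class ParentalConsentRecord:
    """Append-only consent history for one under-13 account.

    Events are never edited or deleted, so the record doubles as
    the audit trail of consent capture and re-verification.
    """
    def __init__(self, child_account_id: str):
        self.child_account_id = child_account_id
        self._events: list[ConsentEvent] = []

    def record(self, action: str, verification_method: str, guardian_ref: str) -> None:
        self._events.append(ConsentEvent(
            timestamp=datetime.now(timezone.utc),
            action=action,
            verification_method=verification_method,
            guardian_ref=guardian_ref,
        ))

    def is_active(self) -> bool:
        """Processing is permissible only while the latest event is a grant or refresh."""
        return bool(self._events) and self._events[-1].action in ("granted", "refreshed")
```

An append-only design makes the refresh/withdrawal history demonstrable to a regulator without reconstructing state from logs.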
Default privacy settings calibrated for child users — minimum data exposure as the starting state. UI flows redesigned where child accounts are auto-detected. Cross-product consistency where multiple services interact.
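"Minimum data exposure as the starting state" translates, in product terms, into a most-restrictive default profile applied the moment a child account is detected. The setting names below are illustrative assumptions, not terms drawn from the law.

```python
def default_settings_for_child() -> dict:
    """Starting-state settings for an auto-detected child account.

    Every data-exposing feature is off by default; anything more
    permissive requires a deliberate (and, under 13, parentally
    consented) change. Field names are hypothetical.
    """
    return {
        "profile_visibility": "private",
        "location_sharing": False,
        "behavioural_advertising": False,
        "data_sharing_with_partners": False,
        "contact_from_unknown_accounts": False,
    }
```

Centralising the default profile in one function also helps with the cross-product consistency point: every service in the group starts from the same baseline.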
Content classification system for age-appropriateness. Blocking, filtering and age-rating mechanisms. Reporting tools for harmful content visible to child users. Engagement with TDRA standards on content categorisation.
Restrictions on targeted electronic advertising to children. Profiling and behavioural advertising disabled for child users. Contextual / non-personalised advertising as alternative. Ad-tech vendor stack reviewed against the under-13 prohibition on data commercialisation.
Hard cut-off for child access to online commercial gaming and gambling. Account-creation restriction; ad-promotion restriction; data-based exploitation restriction. For platforms with adjacent gaming features (e.g. social platforms with gaming integrations), boundary management against the prohibition.
Functional parental control mechanisms accessible to parents/guardians. Reporting tools accessible to child users. Cross-platform consistency where the parental-control system operates across multiple services. Documentation and demonstrability of effectiveness.
Network-level content filtering activation. Calibration to TDRA standards. Filter-list management; appeal / unblocking mechanism for false-positive blocks. Compliance with TDRA inspection requirements.
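The filter-list and appeal mechanism above can be reduced to a single resolution rule: a host on the blocklist is blocked unless a false-positive appeal has cleared it. This is a deliberately simplified sketch — real ISP filtering operates at DNS/SNI or deeper network layers, and the TDRA standards will govern the actual list management.

```python
def resolve_request(host: str, blocklist: set[str], appeal_overrides: set[str]) -> str:
    """Filter decision for one requested host.

    blocklist        -- hosts flagged under the applicable standards
    appeal_overrides -- hosts cleared via the false-positive appeal route
    """
    if host in blocklist and host not in appeal_overrides:
        return "blocked"
    return "allowed"
```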
Network-level provisioning of safe-access measures. Parental control infrastructure compatibility. Customer-facing tooling enabling supervised access for child accounts. TDRA-aligned standards adoption.
Engagement with TDRA, competent authorities (media, child affairs, cybersecurity), and the Child Digital Safety Council. Notification protocols, standards-update tracking, regulatory liaison cadence. For multi-track groups, coordinated regulator engagement avoiding duplicate or conflicting positions.
CDS Law work is delivered through one of three engagement shapes. Given the operational depth of the obligations and the 1 January 2027 deadline, most platform-track entities will need a substantive project engagement; ongoing operation post-build typically benefits from a DPOaaS or advisory retainer arrangement.
For platform-track entities (and dual-track groups) needing to build CDS Law compliance from scratch or close material gaps. Track determination, age verification design, parental consent build, privacy-by-default architecture, content filtering, advertising overhaul, commercial gaming cut-off, parental controls. Substantial scope due to the operational depth of obligations.
For platforms triggered into DPO appointment under Federal PDPL Article 10 — typically activated by large-scale processing of children's data, which qualifies as both sensitive-data processing and high-risk technology use. A named DPO carries both PDPL-facing and CDS Law-facing responsibilities.
For platforms with in-house privacy and trust & safety capability needing senior backup on harder CDS Law questions — platform classification interpretation, age-verification mechanism defensibility, parental consent edge cases, multi-regulator engagement, multi-jurisdiction overlay with EU / UK / India children's data regimes.
Common questions on the CDS Law during the 2026 grace period — from platforms scoping their compliance build, ISPs aligning to TDRA standards, and group operators figuring out which entities sit on which track.
The 1 January 2027 deadline is closer than the build cycle for the obligations the CDS Law actually requires. Age verification, verifiable parental consent, content classification, advertising overhaul, parental controls — each is multi-month work in itself. A 30-minute scoping call costs nothing — we will tell you honestly which track applies, where the hardest exposures sit (typically under-13), and what the right shape of work looks like before the grace period closes.
Schedule a call