What happened, who was affected, and what we can learn…
February 2026 showed a clear shift in regulatory focus: children’s data is no longer treated as a niche privacy issue; it is now one of the ICO’s most visible enforcement priorities.
This month was shaped by major action against online platforms that allowed children to access services without meaningful age verification or adequate safeguards around how their personal data was collected and used. These were not traditional “data breaches” in the ransomware or hacking sense, but they represent something equally serious: organisations processing children’s personal information without a lawful basis or appropriate protection.
The message from the ICO is becoming increasingly direct: if your platform can be accessed by children, you are expected to know that, design for it, and be able to prove you took reasonable steps to protect them.
At the same time, the practical effect of the Data (Use and Access) Act 2025 is beginning to show, with stronger enforcement tools and a wider expectation of organisational accountability.
February’s lesson was simple: when it comes to children’s data, “we didn’t know” is no longer a defence.
As always, this report brings together ICO enforcement activity, public disclosures and credible reporting involving breaches affecting people in the UK.
Only incidents involving UK individuals and confirmed ICO oversight are included.
MediaLab (Imgur)
Date Reported: 5th February 2026
No. of UK Individuals Affected: Not publicly disclosed.
Data Exposed or at Risk: Children’s personal account data, user information collected through platform access and account creation.
ICO Response: Monetary penalty issued for unlawful processing of children’s data and failure to implement appropriate age assurance measures.
Summary: The ICO concluded that MediaLab, owner of Imgur, had failed to implement effective safeguards to prevent children from using the platform, and that children’s personal data had therefore been processed unlawfully. The regulator found that this data was handled without sufficient age verification or appropriate controls under the Age-Appropriate Design Code.
Commentary: This was not a cyberattack, but the regulatory consequences are just as serious: if children can use your service, you are responsible for how their data is handled. Relying on self-declared ages or weak sign-up barriers is no longer seen as reasonable; it is increasingly viewed as negligence.
Reddit
Date Reported: 24th February 2026
No. of UK Individuals Affected: Not publicly disclosed.
Data Exposed or at Risk: Personal data of children under 13 processed without a lawful basis.
ICO Response: Monetary penalty of approximately £14.5 million issued for failures in age verification and unlawful processing involving underage users.
Summary: The ICO found that Reddit had allowed children under 13 to create accounts and have their personal data processed without sufficient safeguards. The platform relied heavily on self-declared ages without meaningful verification, which the regulator considered inadequate under UK GDPR and the Children’s Code.
Commentary: This was one of the strongest regulatory statements we have seen on children’s privacy. A fine of this size makes the point very clearly: asking a user how old they are and taking the answer at face value is not a serious compliance strategy.
Insights for UK Organisations
- Children’s privacy is now a frontline enforcement issue. The Children’s Code is no longer treated as background guidance; it is being actively enforced with major financial penalties.
- Age verification must be credible. Self-declaration without real checks is quickly becoming one of the weakest justifications an organisation can offer the ICO.
- The ICO is moving towards thematic enforcement. February’s cases show a focus on patterns of behaviour across entire sectors, not just isolated failures.
- If children can access your platform, your compliance obligations increase immediately. Businesses cannot rely on an adult-only intended audience where real-world usage says otherwise.
Legislative Context
February 2026 fell within the early practical impact of the Data (Use and Access) Act 2025, with several provisions now strengthening how the ICO investigates and enforces compliance.
This includes:
- Stronger investigatory powers.
- The ability to require independent compliance reporting.
- Wider scrutiny of governance and risk decisions.
- Greater focus on accountability before an incident occurs, not just after.
Alongside this, the Age-Appropriate Design Code continues to shape expectations for any online platform that children may reasonably access, whether or not children are the intended audience.
The regulatory position is becoming much clearer: organisations must be able to demonstrate privacy by design, especially where younger users are involved.
Conclusion
February 2026 was not defined by ransomware headlines or stolen databases; it was defined by something more fundamental: whether organisations had the right to process children’s personal data at all.
Both MediaLab and Reddit reflected the same underlying problem: platforms designed around growth and engagement first, with privacy controls added later, if at all.
That approach is becoming increasingly difficult to defend.
For organisations operating digital services in the UK, especially those accessible to younger users, compliance now means:
- Meaningful age assurance.
- Stronger lawful basis decisions.
- Privacy built into product design from the start.
- The ability to explain those decisions clearly to the regulator.
The ICO is making one thing very clear: children’s data is not just another compliance issue; it is one of the clearest tests of whether an organisation takes data protection seriously at all.
Disclaimer
This report is based on public disclosures, media reports, and ICO updates available at the time of writing. Figures for affected individuals may be estimated where not officially disclosed. This post is intended for informational purposes only and does not constitute legal advice.
Sources:
- Imgur Owner MediaLab Fined Over Children’s Privacy Failures
- Reddit Enforcement Action
- Reddit Fined $20 Million in UK Over Children’s Data Failures
- Imgur is Blocking Users in the UK
- Reddit Hit with $20 million UK Data Privacy Fine Over Child Safety Failings
- Reddit Issued with £14.47m Fine for Children’s Privacy Failures
- Reddit Fined £14.5m in UK Over Use of Under-13s’ Data
