The UK government is reportedly ready to push Apple and Google into creating system-level tools that can detect nudity and prevent it from appearing on a screen unless the user presents evidence that they are an adult, according to a report in the Financial Times. The measures, which would take effect on iOS and Android devices, are intended to prevent people from taking, sharing or viewing explicit pictures without age verification.
Officials are not planning to seek an immediate change in the law, but want the tech giants to adopt controls voluntarily. The Home Office is believed to prefer a regime of on-device nudity scanning combined with strong age verification, such as biometric age checks or certified ID documents, that would have to be satisfied before explicit material could legally be created or viewed.

What the U.K. Is Proposing for Device-Level Age Checks
Ministers want Apple and Google to build nudity-detection software into their operating systems, so that a default block on explicit images could only be lifted once users prove they are over 18. That would apply to the camera, messaging apps, photo galleries, and web browsers (though likely not to pages rendered by a third-party app that displays images through system frameworks).
The idea also reflects an emerging policy trend: moving age verification out of individual apps and into the hands of device and app-store gatekeepers. Meta and others have argued publicly that Apple and Google are best equipped to run age checks at scale, given their control over sign-in systems, app distribution, and device-level safety features.
How It Might Work on iOS and Android Devices
Both ecosystems already rely on on-device machine learning to flag sensitive imagery. Apple’s Communication Safety feature can obscure detected nudity in Messages for child accounts, and its Sensitive Content Warning soft-blocks such media across apps; Android offers content filters, app ratings, and SafeSearch-style controls. The British blueprint would go further, making detection proactive and tying access to proof of adulthood.
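Apple already exposes this kind of on-device detection to third-party apps through its SensitiveContentAnalysis framework (iOS 17 and later), which underpins the Sensitive Content Warning feature. The following Swift sketch is a minimal illustration of how an app can ask the system to classify an image locally; it assumes the app has the Sensitive Content Analysis entitlement and that the user has enabled the relevant safety setting.

```swift
import Foundation
import SensitiveContentAnalysis

// Minimal sketch: asking the system to classify an image on-device.
// The analysis runs locally and the app only learns a boolean verdict;
// no image data leaves the phone.
func shouldBlurIncomingImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user (or a parent) has not enabled Sensitive Content Warning
    // or Communication Safety, the policy is .disabled and nothing is checked.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // On-device classification of the image file.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // Whether to fail open or closed is a product decision;
        // this sketch simply declines to blur on error.
        return false
    }
}
```

Today the decision to blur stays with the app and the user; the UK proposal would effectively bolt an age-verification requirement onto the same kind of verdict.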
In practice, it would mean a user trying to take, open, or send an explicit image would see in-system prompts for age verification — by facial age estimation tools, passport or driving licence checks, or account-level attestations.
Developers might have to adopt new OS APIs that enforce the block, with app store policy used to ensure compliance. The companies would also be told to keep scanning on the device, rather than off it, to minimize the amount of data shared and reduce privacy risk.
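No such gating API exists today on either platform. Purely as an illustration of the control flow the proposal implies, the sketch below invents a hypothetical system service (the names AgeAssuranceGate, hasVerifiedAdultStatus, and requestAdultVerification are made up) in which the OS, not the app, runs the verification step and holds the identity data.

```swift
import Foundation

// Hypothetical sketch only: nothing like this ships in iOS or Android today.
// It shows how an app might defer both the age prompt and the stored
// verification signal to the operating system, so the app never sees
// ID documents or biometric data.

/// Invented stand-in for a system-provided verification service.
protocol AgeAssuranceGate {
    /// True if the OS already holds a verified-adult signal for this user.
    func hasVerifiedAdultStatus() async -> Bool
    /// Asks the OS to run its own verification flow (facial age estimation,
    /// an ID document check, or an account-level attestation).
    func requestAdultVerification() async -> Bool
}

/// Control flow an app might follow before showing an image the
/// system has flagged as explicit.
func canDisplayFlaggedImage(using gate: AgeAssuranceGate) async -> Bool {
    if await gate.hasVerifiedAdultStatus() {
        return true  // Verification already cached at the OS level.
    }
    // Otherwise the OS presents its own prompt; the app only receives
    // a pass/fail answer, never the underlying identity data.
    return await gate.requestAdultVerification()
}
```

Keeping the check inside the OS is what would let the device-level approach work even for encrypted messaging apps that have no server-side view of content.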
Regulatory Backdrop and the Implications
The push comes as enforcement of the UK’s Online Safety Act gets under way; the law gives Ofcom the power to demand that porn services adopt “very robust” age assurance. Ofcom’s draft codes set out expectations for strong checks, and the regulator can levy heavy fines on regulated services that fail to comply. While the Act targets platforms rather than device-makers, the government’s gambit is to strengthen protections upstream, at the operating-system level.

This isn’t the first time Britain has attempted something like this. An earlier effort under the Digital Economy Act to require age checks for adult sites was scrapped over technical and privacy concerns. The new approach would instead build on device-level features developed over the past few years, along with parental tools and privacy settings designed to protect children online. Related protections already exist in the Information Commissioner’s Office’s Children’s Code, which requires services to default to high-privacy settings where children are concerned.
Abroad, France’s regulator has moved to mandate age checks for adult sites, and the EU’s Digital Services Act tightens platform obligations toward minors. Anything Britain does that reaches deep into iOS and Android would have consequences well beyond the United Kingdom, given the global nature of app stores and OS updates.
Benefits, Risks and Open Questions for OS-Level Nudity Checks
Child safety campaigners such as the NSPCC have warned for years about minors’ access to explicit content, citing stark statistics indicating that many teenagers encounter pornographic images, often by accident. Ofcom’s research likewise shows that large proportions of 13- to 17-year-olds have seen porn online, with first exposure often in the early teens. Advocates say OS-level controls could close gaps that app-by-app filters miss.
Privacy groups argue that scanning devices, even when the analysis happens entirely on the device, could normalize content inspection and produce false positives, chilling lawful expression, blocking sexual health and education resources, or misclassifying nonsexual images such as breastfeeding photos or art. Apple previously abandoned a controversial plan to scan devices for child sexual abuse material after pushback from security researchers and civil liberties groups.
Accuracy and bias in age estimation remain problematic, with vendors reporting high overall performance but varying error rates across demographics. There are also logistical questions: how verified age signals would be synchronized across accounts and devices, how appeals would work, and whether adults could opt out. For developers, mandated APIs could add complexity and cost, especially for encrypted messaging apps with no server-side access to content.
What Comes Next for Apple, Google, and U.K. Policy
The Home Office is expected to discuss the plan with Apple and Google, and any formal policy would probably go through consultation with Ofcom, the Information Commissioner’s Office (ICO), and industry. Neither company has publicly endorsed the plan, and both would presumably be reluctant at first; Android OEMs in particular tend to resist OS-level requirements that could force privacy or developer-ecosystem changes.
Even without immediate legislation, the push raises pressure on gatekeepers as the U.K. rolls out its Online Safety Act. If Apple and Google adopt a verified-adult layer for nudity, it could recast the default experience for billions of users and reshape the global debate over how far device makers should go in policing content on their customers’ personal phones.
