Apple Just Made Every iPhone in the UK a Child’s Device Until You Prove Otherwise

Yesterday, Apple dropped iOS 26.4. If you’re in the UK and you’ve updated, you’ve probably already seen it: a new screen asking you to confirm you’re over 18. If you don’t, your iPhone flips on web content filters and Communication Safety features automatically. Your £1,200 phone becomes, in effect, a child’s device.

Let me say upfront: I am completely in favour of protecting children online. Completely. The stats are grim. Ofcom’s own research shows children as young as eight are accessing pornography. That’s not acceptable. Something has to change.

But the way this has been rolled out raises real questions, and I think it’s worth talking about what actually happened rather than just the headline.

What Apple Has Actually Done

When you update to iOS 26.4, Apple checks whether it can confirm you’re an adult. If you’ve had an Apple Account for long enough and you’ve got a credit card on file, it might do this automatically. You may not even notice. For those people, this is seamless. Apple’s own support page explains that existing payment methods and account history can be used to verify your age without any extra steps.

But if Apple can’t confirm it automatically, you need to do it yourself. Your options are: link a credit card (not a debit card, which is what most people in the UK actually use day to day), or scan a driving licence or national ID. UK passports are not accepted unless you’ve created a Digital ID in Apple Wallet using a US passport, which obviously rules out most people in this country.

The Reddit thread about this is something else. People reporting ten or more attempts to get their driving licence scanned. Failures with provisional licences. Failures with passports. One person noted you need to photograph your licence against a dark, matte background, with no shadows or glare, and with a very steady hand. That’s not exactly the Apple experience most people signed up for.

And if you don’t verify at all? Web content filters go on. Communication Safety features that scan for nudity in Messages get activated. You lose the ability to change your content restrictions. Your phone is now locked down until you prove who you are.

The Online Safety Act Context

This is all happening under the umbrella of the UK’s Online Safety Act, which came into force in July 2025. The Act requires platforms to implement age verification so children can’t access harmful content. That includes pornography, but also content related to self-harm, suicide, and eating disorders.

The interesting wrinkle is that Apple’s App Store and iOS are not actually covered by the Online Safety Act. Ofcom confirmed this. Apple has gone further than the law currently requires, and Ofcom has welcomed it. In a statement, the regulator called it a “real win for children and families” and said the UK would be one of the first countries in the world to receive these protections at the device level.

So Apple isn’t being forced to do this. They’re choosing to do it. That’s worth noting, because it changes the conversation. This isn’t just compliance. This is Apple making a strategic decision about what role the operating system should play in age verification, and the UK is the testing ground.

The Privacy Problem

Here’s where it gets uncomfortable. Age verification only works if you hand over something that proves who you are. A credit card. A driving licence. A national ID. That’s personal data, and once it’s in the system, you have to trust that it stays safe.

Apple says your credit card or ID document isn’t stored unless you choose to save it for other purposes. That’s reassuring. But the broader pattern here is what concerns me. When the Online Safety Act kicked in last July, VPN downloads in the UK went through the roof. Proton VPN reported a 1,400% surge in sign-ups within minutes. NordVPN saw a 1,000% increase. Five of the top ten free apps on the UK App Store were VPNs within days of the law coming into force. Over 550,000 people signed a parliamentary petition calling for the Act to be repealed. It was debated in December 2025.

That tells you something. A significant chunk of the UK population doesn’t trust the system. Not because they’re trying to access anything illegal, but because they fundamentally object to handing over personal identification to use the internet. And there’s a reasonable argument behind that objection. When you create a centralised record of who has verified their age and where, you create a target. The Electronic Frontier Foundation, Open Rights Group, Big Brother Watch, and Index on Censorship have all raised exactly this concern.

Big Brother Watch’s director Silkie Carlo called Apple’s update “more like ransomware” and said it puts a “chokehold on Britons’ freedom to search the internet.” That’s strong language. But when you look at what happens if you don’t verify (a restricted device that you’ve already paid for), you can see where she’s coming from.

The Bit Nobody’s Talking About

Here’s the thing that I haven’t seen enough people discuss: this doesn’t just affect Apple. The Online Safety Act applies across the board. Social media platforms, search engines, gaming sites, forums. The age verification requirement on websites has already caused smaller platforms to shut down because they simply can’t afford to implement compliant systems. Hobby forums about trains, football, and video games are closing because the compliance burden is too heavy for volunteer-run communities.

And before anyone turns this into an Apple-bashing exercise, Google are on the same road. They’ve already rolled out age verification on the Play Store, requiring users to prove they’re 18 or older before downloading certain apps. The options are similar: upload an ID, take a selfie, use a credit card, or go through a third-party service. Google have also added age checks to YouTube and Search. The difference right now is that Google haven’t locked down the entire operating system the way Apple have. Android isn’t defaulting you to a child’s experience if you don’t verify. But given the pressure from the Online Safety Act and similar laws rolling out in the US, it would be naive to think that’s not coming. This is an industry-wide shift, not a single company’s decision.

Apple, to their credit, have probably built the most frictionless version of age verification we’ve seen so far. If your account is old enough or you’ve got a credit card on file, you might sail through it. But the direction of travel is what worries me. Today it’s Apple. Tomorrow it could be every device, every app, every website. And while the intention of keeping children safe is a good one, the execution keeps creating side effects that nobody seems to have thought through properly.

Where I Actually Land on This

I don’t think this is as simple as “Apple bad” or “government overreach.” It’s messier than that.

Protecting children online is not optional. The content that’s freely accessible to young people right now is genuinely harmful, and anyone who dismisses that hasn’t looked at the evidence. Something needs to be done.

But “something” shouldn’t mean “everything.” The approach so far has been to build bigger and bigger gates that everyone has to walk through, and then act surprised when people find ways around them or refuse to comply on principle. A VPN costs a few pounds a month. A technically confident 14-year-old can set one up in minutes. The people most affected by these restrictions are not the people the restrictions are designed to protect. They’re ordinary adults who now have to jump through hoops to use a device they own.

What I’d like to see is a more serious conversation about where the responsibility actually sits. Parental controls already exist. Apple’s Screen Time and Family Sharing tools are genuinely good. Education about digital safety is improving, slowly. But slapping an age gate on the entire operating system and defaulting to restricted mode for anyone who doesn’t comply? That feels like the blunt end of the hammer.

I’ll keep watching this one closely. If you’ve updated to iOS 26.4 and you’re struggling with the verification process, the most reliable route seems to be a UK driving licence photographed against a plain dark background in good lighting. Give it a few attempts. And if you don’t have a driving licence or a credit card? Right now, Apple hasn’t given you many options. That’s a problem they need to fix.


Scott Quilter | Co-Founder & Chief AI & Innovation Officer, Techosaurus LTD