Could plans to introduce a backdoor into iCloud encryption be a step towards mass surveillance?

In this blog, Dr Mahdi Aiash, an Associate Professor of Computer Science and Cyber Security, looks at the potential impact of Apple introducing a backdoor into iCloud encryption.
A recent report claims that the UK government is pressuring Apple to introduce a backdoor into iCloud encryption, arguing that end-to-end encryption makes criminal investigations harder. But here's the catch: there is no official confirmation from the UK government of this demand. While law enforcement agencies have long voiced concerns about encryption blocking access to vital evidence, it's crucial to approach these claims with healthy scepticism until more details emerge.
This debate isn’t new. In 2023, I discussed these very concerns in an interview on encryption and cybersecurity. Back then, the UK’s focus was on weakening encryption in messaging apps like WhatsApp and Signal under the Online Safety Act. My stance remains the same—it’s not about choosing between privacy and security; both are essential. Governments need tools to fight crime, but encryption plays a critical role in protecting users from cyber threats, identity theft, and surveillance.
Now, the conversation has shifted. Instead of focusing on real-time messaging, the reported demand is for a backdoor into iCloud storage, which Apple recently made fully end-to-end encrypted. This means not even Apple can access stored user data—a major shift in digital security.
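To make that shift concrete, here is a minimal Python sketch of the client-side encryption principle behind end-to-end encrypted storage. It uses the third-party `cryptography` library purely for illustration; this is not Apple's actual design, only a sketch of the idea that the provider stores ciphertext it cannot decrypt.

```python
# Minimal illustration of client-side ("end-to-end") encryption for cloud storage.
# This is NOT Apple's implementation -- just the principle that the provider
# only ever receives ciphertext it has no key for.
from cryptography.fernet import Fernet

# The key is generated and kept on the user's device only.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

# Before upload, the file is encrypted locally...
plaintext = b"holiday_photo.jpg contents"
ciphertext = cipher.encrypt(plaintext)

# ...so the cloud provider stores only ciphertext.
cloud_storage = {"backup/photo.jpg": ciphertext}

# Without device_key, neither the provider nor anyone who compels it
# can recover the plaintext from what is stored server-side.
recovered = cipher.decrypt(cloud_storage["backup/photo.jpg"])
assert recovered == plaintext
```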
If these reports are accurate, this demand goes far beyond accessing messages—this is about potential access to a person’s entire cloud-stored data, including messages, photos, and sensitive documents. That raises a pressing question:
Is this about protecting public safety, or is it a step toward mass surveillance?
Encryption is a double-edged sword. It secures our personal data, protects businesses from cyber threats, and upholds privacy. But it also creates blind spots for law enforcement, making it harder to track criminals who exploit it to stay under the radar.
Several high-profile cases have highlighted this challenge.
Law enforcement’s stance is simple: If encryption blocks access to crucial evidence, how can they protect the public?
While the UK and other Western democracies have legal safeguards, the same cannot be said for authoritarian governments. In countries with weaker civil liberties protections, encryption backdoors could easily be used for mass surveillance, political repression, and silencing dissent. This isn’t just about crime prevention anymore; it’s about trust in governments and tech companies.
Law enforcement concerns are real, but creating a backdoor in encrypted systems is a risky move: a weakness engineered for investigators can also be discovered and exploited by hostile actors, and once it exists it weakens security for every user, not just the targets of an investigation.
At the same time, restricting encryption too much can backfire, pushing criminals to become more creative. History shows that when authorities introduce tighter controls, criminals find new ways to evade them.
Forcing mainstream platforms to introduce backdoors may not be as effective as intended. While it could provide short-term access to data, determined criminals will likely shift to more secure and decentralized alternatives, making long-term investigations even more complex. This raises a key concern: How can law enforcement balance security and privacy without driving criminals deeper underground?
Beyond the government access debate, encryption also impacts AI-driven safety tools used by tech platforms: once content is end-to-end encrypted, platforms can no longer scan it server-side, which many automated abuse-detection and moderation systems rely on.
There's no simple answer to this debate. Commonly discussed middle-ground approaches, such as key escrow, client-side scanning, and targeted lawful hacking, each try to reconcile security and privacy, but each has its own drawbacks.
Each option balances privacy and security differently, yet none offers a perfect solution. The real challenge is finding scalable, legally sound methods that don't compromise global cybersecurity.
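Key escrow is worth a closer look, because it illustrates the core objection to any "lawful access" design. The sketch below is hypothetical Python (again using the `cryptography` library, not any vendor's real scheme): each file key is wrapped for the user and for an escrow authority, so whoever holds, steals, or is compelled to hand over the escrow key can decrypt every user's data.

```python
# Sketch of a key-escrow style "backdoor": each file key is wrapped twice,
# once for the user and once for an escrow authority. Hypothetical design,
# not any vendor's real scheme.
from cryptography.fernet import Fernet

user_key = Fernet.generate_key()      # held on the user's device
escrow_key = Fernet.generate_key()    # held by the provider / authority

def encrypt_for_upload(plaintext: bytes) -> dict:
    file_key = Fernet.generate_key()  # fresh key per file
    return {
        "ciphertext": Fernet(file_key).encrypt(plaintext),
        "wrapped_for_user": Fernet(user_key).encrypt(file_key),
        "wrapped_for_escrow": Fernet(escrow_key).encrypt(file_key),
    }

record = encrypt_for_upload(b"private medical records")

# The escrow key alone is enough to recover any user's plaintext:
file_key = Fernet(escrow_key).decrypt(record["wrapped_for_escrow"])
print(Fernet(file_key).decrypt(record["ciphertext"]))

# Every user's security now rests on that single escrow key never being
# stolen, leaked, or misused -- the "single point of failure" critics warn about.
```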

Dr Mahdi Aiash is an Associate Professor of Computer Science and Cybersecurity at Middlesex University, where he leads the Cybersecurity Research Group. He also serves as a Director of CATS², spearheading research and policy development at the intersection of cybersecurity, technology safety, and societal impact.
His expertise spans AI-driven cybersecurity, exploitation research, and emerging threats. With extensive industry experience, Mahdi actively collaborates with businesses to bridge the gap between academia and real-world cybersecurity challenges. He also sits on multiple advisory panels in cybersecurity, contributing to policy discussions and shaping best practices, and has been featured in national media as a leading expert in AI and cybersecurity.
Tags: AI, Encryption, government, human rights, iCloud