Why End-to-End Encryption Is Under Political Pressure Everywhere
by Scott
In January 2025, the British government issued a secret order to Apple demanding that the company provide blanket access to the encrypted cloud data of Apple users, not merely users in the United Kingdom but users anywhere in the world. The order was issued under powers contained in the Investigatory Powers Act, a piece of legislation passed in 2016 that gave British authorities expansive capabilities to compel technology companies to assist with surveillance. The order was kept secret, partly because the law itself contains provisions restricting companies from disclosing that they have received such demands. Apple, rather than comply, chose to withdraw its Advanced Data Protection feature from the United Kingdom entirely, meaning that British users lost the option to enable the highest level of encryption for their iCloud backups, photos, notes, and other data. The company stated it had never built a backdoor into any of its products and never would. The British government then reportedly issued a second order, later in the year, attempting to limit the demand to British users specifically, as if the violation of privacy were less troubling when applied to a smaller population.
The Apple situation is a vivid illustration of a conflict that is simultaneously intensifying and broadening. Governments around the world are pressing, with increasing insistence and increasingly explicit legal instruments, for access to communications and data that end-to-end encryption places beyond their reach. The technology companies that provide encrypted services are caught between the legal demands of the governments in whose jurisdictions they operate and the security commitments they have made to their users. The civil liberties organizations and security researchers who understand the technical realities of encryption are sounding alarms with a frequency and urgency that reflects genuine concern that the current trajectory leads somewhere very bad. And the general public, which benefits from encryption every time it sends a private message, makes a banking transaction, or stores a sensitive document in the cloud, is largely unaware that the technology protecting those activities is under sustained political assault.
To understand why this conflict is so sharp and why it matters so much, it helps to understand what end-to-end encryption actually does and why its properties make it simultaneously valuable to everyone who uses it and threatening to governments that want to surveil communications. End-to-end encryption is a method of securing communications in which the data is encrypted on the sender’s device and can be decrypted only with keys held by the endpoints of the conversation, not by any intermediary. The encrypted data passes through the networks and servers of the service provider, but those intermediaries have no ability to decrypt it because they do not have the necessary keys. Only the two endpoints of the communication, the sender and the recipient, can read the content. This means that a messaging service like Signal or WhatsApp, when operating with end-to-end encryption enabled, genuinely cannot read its users’ messages even if it wants to, and genuinely cannot hand over the content of those messages to a government agency even if ordered to, because the content is not accessible to the service at all.
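The structural point, that the relay never holds the keys, can be sketched with a toy Diffie-Hellman key agreement. This is an illustrative sketch only: the tiny prime and the XOR "cipher" are stand-ins for the vetted primitives real systems such as the Signal protocol use, and would be wholly insecure in practice.

```python
import hashlib

# Toy public group parameters (a Mersenne prime; real systems use
# much larger groups or elliptic curves).
P = 2**127 - 1
G = 5

def keypair(secret: int):
    # Each endpoint keeps its secret exponent; only G^secret mod P
    # ever leaves the device.
    return secret, pow(G, secret, P)

alice_priv, alice_pub = keypair(123456789)
bob_priv, bob_pub = keypair(987654321)

# The service provider relays only the public values. Without a
# private exponent, it cannot derive the shared key.
relay_sees = (alice_pub, bob_pub)

alice_shared = pow(bob_pub, alice_priv, P)   # (G^b)^a mod P
bob_shared = pow(alice_pub, bob_priv, P)     # (G^a)^b mod P
assert alice_shared == bob_shared            # same key at both endpoints

key = hashlib.sha256(str(alice_shared).encode()).digest()

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real authenticated cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

ciphertext = xor_cipher(b"meet at noon", key)
assert xor_cipher(ciphertext, key) == b"meet at noon"
```

The provider in this sketch sees `relay_sees` and `ciphertext` and nothing else, which is why a correctly built end-to-end system cannot comply with a content demand: there is no key on the server to disclose.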
This is not a policy choice that companies make. It is a mathematical and engineering reality. The encryption either works or it does not. A company that builds end-to-end encryption correctly has created a system in which it is technically incapable of reading the communications it transmits. This property is enormously valuable for users, because it means their private communications are protected not only from criminals and hackers who might compromise the service provider’s systems, but also from the service provider itself and from any government agency that might compel the provider to disclose data. Journalists communicating with sources, lawyers communicating with clients, doctors communicating with patients, activists communicating with colleagues in repressive environments, and ordinary people communicating with family and friends about personal matters all benefit from the assurance that their communications are genuinely private.
The problem from a law enforcement and intelligence perspective is that the same property that protects innocent users from surveillance also protects criminals and terrorists. When a violent extremist plans an attack using an end-to-end encrypted messaging application, the encrypted messages are beyond the reach of investigators even if they obtain a lawful warrant and even if the messaging company fully cooperates with their request. The company cannot provide what it does not have. Law enforcement agencies have found this reality increasingly frustrating as encryption has become widespread, and they have argued persistently that it constitutes a dangerous and unjustifiable limitation on their ability to investigate serious crimes and prevent terrorist attacks.
This argument has a genuine moral weight that should not be dismissed. The investigation of child sexual abuse material, which has been one of the most frequently cited law enforcement concerns in this debate, involves real children suffering real harm, and the inability to access communications used to organize and distribute such material is a real obstacle to identifying and prosecuting abusers. The investigation of terrorism, organized crime, drug trafficking, and other serious offenses is similarly impeded by communications that cannot be read. Law enforcement officials who make these arguments are not fabricating concerns for political purposes. The concern is real.
The critical technical dispute is whether anything can be done about it without destroying the security that encryption provides for everyone. Governments and some law enforcement agencies have proposed various forms of what are often called exceptional access mechanisms, essentially technical arrangements that would allow law enforcement to access encrypted communications under specified legal conditions. These proposals have taken different forms. Some involve building mathematical backdoors into encryption systems that only authorized parties can use. Others involve requiring service providers to retain copies of encryption keys with a trusted third party that can be compelled to disclose them. Others involve client-side scanning, in which messages are analyzed on the user’s own device before being encrypted and transmitted, so that flagged content can be reported even though the communication itself remains encrypted.
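The key-escrow variant of these proposals can be sketched in a few lines. Everything here is hypothetical, including the names and the toy cipher; the sketch exists to show the structural problem the next paragraph describes, namely that the escrow store itself becomes a single point of failure.

```python
import hashlib

escrow_db = {}  # escrow agent's store: user -> copy of the user's key

def register_user(user: str, passphrase: str) -> bytes:
    # Derive the user's key and deposit a copy with the escrow agent.
    key = hashlib.sha256(passphrase.encode()).digest()
    escrow_db[user] = key
    return key

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

alice_key = register_user("alice", "correct horse battery staple")
ciphertext = xor_cipher(b"private message", alice_key)

# The intended "lawful access" path: a court order retrieves the
# escrowed key and decrypts the message...
assert xor_cipher(ciphertext, escrow_db["alice"]) == b"private message"

# ...but an attacker who exfiltrates the escrow database gains
# exactly the same capability, for every registered user at once.
stolen_db = dict(escrow_db)
assert xor_cipher(ciphertext, stolen_db["alice"]) == b"private message"
```

Nothing in the mathematics distinguishes the court order from the breach: both paths decrypt with the same stored key.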
The technical consensus among cryptographers and security researchers is nearly unanimous and has been expressed repeatedly over decades: there is no such thing as an exceptional access mechanism that can be made available to authorized parties without also creating a vulnerability that can be exploited by unauthorized ones. A backdoor built for law enforcement is a backdoor that can be found and used by criminals, foreign intelligence services, and any other sufficiently motivated attacker. The history of technology is full of examples of carefully controlled access mechanisms that were discovered and exploited by parties who were never intended to have access. The notion that a mathematical weakness can be deployed in a targeted and controlled way, available to the right governments and inaccessible to the wrong ones, reflects a misunderstanding of how mathematics and software work.
This was illustrated with particular clarity by the Salt Typhoon hack, a Chinese government cyber operation revealed in late 2024 that compromised the systems of major American telecommunications companies, including AT&T and Verizon. The operation gained access to communications data partly through the lawful intercept systems that telecommunications companies are required by American law to maintain for law enforcement access. These systems, built specifically to allow authorized governmental access to communications, were exploited by Chinese intelligence to conduct surveillance of American officials and politicians. The very infrastructure built to provide controlled access to communications became the attack surface through which an adversary conducted unauthorized access. American cybersecurity officials, in the immediate aftermath of this revelation, publicly recommended that people use end-to-end encrypted messaging applications precisely because the lawful intercept systems in telecommunications networks had proven to be an exploitable vulnerability.
The contradiction embedded in this situation is stark and has not been fully acknowledged by the governments that continue to press for exceptional access. American intelligence agencies recommend that ordinary citizens use strong end-to-end encryption to protect themselves from foreign adversaries. At the same time, American law enforcement agencies argue that the spread of end-to-end encryption is a dangerous development that hampers legitimate investigations. The United Kingdom government orders Apple to create backdoor access to encrypted data. Security experts warn that any backdoor created for British law enforcement would create a vulnerability exploitable by the same Chinese government whose espionage operations prompted American officials to recommend encrypted messaging. These contradictions are not merely rhetorical inconveniences. They reflect a genuine incoherence in the policy positions of governments that simultaneously depend on strong encryption for their own security and seek to weaken it for their own surveillance.
In the European Union, the conflict has centered on a proposal commonly referred to as Chat Control, a legislative initiative advanced by the European Commission that has gone through multiple iterations and faced sustained opposition from privacy advocates, security researchers, and members of the European Parliament. The proposal would require messaging platforms to scan the content of communications for child sexual abuse material, even when those communications are end-to-end encrypted. The mechanism proposed to achieve this without technically breaking encryption is client-side scanning, in which the scanning occurs on the user’s device before the message is encrypted and sent. The analysis of this approach by technical experts has been clear: client-side scanning, whatever it is called, functions as a surveillance mechanism built into the user’s own device that monitors communications before they are protected by encryption. Calling the communications encrypted while scanning them before encryption is applied is a form of technical misdirection.
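The mechanics of client-side scanning can be reduced to a short sketch. The blocklist entry, the reporting step, and the toy cipher below are all hypothetical placeholders (deployed proposals typically involve perceptual image hashing rather than exact hashes); the point is the ordering: the scan runs on the plaintext, on the user's device, before any encryption is applied.

```python
import hashlib

# Hypothetical hash list of known-bad content (placeholder bytes).
BLOCKLIST = {hashlib.sha256(b"known-bad-content").hexdigest()}
reports = []  # what the scanner would forward to a reporting authority

def toy_encrypt(msg: bytes) -> bytes:
    # Stand-in for real end-to-end encryption.
    return bytes(b ^ 0x5A for b in msg)

def send(message: bytes) -> bytes:
    # The scan inspects the plaintext before encryption, which is why
    # experts describe it as surveillance of the message regardless of
    # what happens on the wire afterward.
    digest = hashlib.sha256(message).hexdigest()
    if digest in BLOCKLIST:
        reports.append(digest)
    return toy_encrypt(message)

ct = send(b"known-bad-content")
assert len(reports) == 1            # flagged despite the encryption
assert ct != b"known-bad-content"   # the wire still carries ciphertext
```

The wire traffic remains encrypted, and the content was nonetheless read and reported, which is the misdirection the preceding paragraph describes.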
The Chat Control proposal has been repeatedly delayed, modified, and in some iterations withdrawn under pressure, but it has also repeatedly returned in forms that attempt to address the most prominent objections while preserving the core requirement for scanning. The pattern reflects a determination on the part of its advocates that is resistant to the technical objections raised by experts, and a political calculation that the protection of children from exploitation is a cause sufficiently powerful that it can justify surveillance measures that would otherwise be unacceptable. The use of child protection as the primary justification for surveillance infrastructure is not new. It has appeared in encryption debates across multiple countries and multiple decades, and civil liberties advocates have noted that the infrastructure built ostensibly to identify child exploitation material rarely remains limited to that purpose once it exists.
The United Kingdom’s approach under the Investigatory Powers Act represents a model of government surveillance authority that other countries have observed with interest. The Act’s provisions include not only the ability to compel technical assistance from companies but also constraints on those companies’ ability to disclose that they have received demands, creating a system of secret compelled cooperation that operates outside normal public accountability. When Apple withdrew its Advanced Data Protection feature from the United Kingdom rather than comply, it made the existence of such demands visible, but only because it chose not to comply. A company less willing to take that public stance, or operating in a jurisdiction where the consequences of noncompliance were more severe, might have quietly built the access mechanism and said nothing.
France attempted in early 2025 to include a provision in legislation targeting drug trafficking that would have required access to end-to-end encrypted communications for intelligence purposes. The measure was rejected by the National Assembly after debate that included arguments about the systemic security risks it posed. Sweden faced legislative proposals that would have imposed surveillance obligations on encrypted services and prompted Signal, one of the most trusted encrypted messaging applications, to threaten withdrawal from the Swedish market entirely. Switzerland, a country with a traditionally strong reputation for privacy protection, faced legislative changes that would have expanded surveillance capabilities in ways that led Proton, a Swiss privacy technology company, to announce it would relocate its physical infrastructure outside the country if such measures were implemented.
Freedom House documented in a 2024 report that in at least seventeen of the seventy-two countries it surveyed, encrypted services had been blocked within the preceding five years. This figure includes outright blocking of services rather than legal demands for exceptional access, representing a blunter form of the same impulse to prevent citizens from communicating beyond the reach of governmental surveillance. Countries that block encrypted services tend to be those ranked as not free or partly free in assessments of internet freedom, but the direction of pressure in democratic countries has been toward legal mandates for access rather than outright blocking, which is a more sophisticated and more durable approach to the same goal.
The argument that is most difficult to answer, for those who want to preserve strong encryption, is the genuine harm caused by its abuse. The crimes that encrypted communications enable or facilitate are real. The suffering of victims is real. The frustration of investigators who cannot access evidence they know exists behind an encryption barrier is real. The response of encryption advocates that governments always find other means of investigation, through metadata, physical surveillance, informants, device seizure, and the many other tools available to law enforcement, is true but not fully satisfying to those focused on the specific cases where those other means are insufficient.
What the encryption debate ultimately comes down to is a question about how to distribute a fundamental tradeoff between security and privacy at a societal level. Strong encryption protects everyone, including criminals. Weak encryption or backdoored encryption exposes everyone to the risks that weaknesses create, including the innocent users whose data becomes accessible to adversaries through the same vulnerabilities that governments claim they will control. There is no technical solution that resolves this tradeoff. There is only a policy choice about which side of it a society is willing to live on, and the consequences of that choice will be borne by the billions of people whose communications flow through encrypted systems they may never fully understand but on which they genuinely depend.
The consistency with which cryptographers, security researchers, and civil liberties organizations have made the case for strong encryption, and the consistency with which governments have returned to demand access in spite of those arguments, suggests that this conflict is not going to be resolved by better technical explanations or more persuasive policy arguments. It reflects a fundamental tension between the interests of states in monitoring their populations and the interests of individuals in communicating privately, a tension that has existed throughout human history and that the invention of strong public-key cryptography in the 1970s and 1980s resolved, for the first time, decisively in favor of individuals. Governments have been trying to recover that lost ground ever since.