Apple Announces Stronger Data Encryption, to the Dismay of FBI Snoops
Photos and information you store on iCloud will be safer from hackers, spies, and the government.
Defying the snoops at the FBI, Apple has announced it is implementing end-to-end encryption options for the data people store on iCloud, making it all the more difficult for hackers, criminals, and the aforementioned government agency to access your info without your knowledge or permission.
Apple made the announcement Wednesday evening, and it should be treated as a big deal by anybody who values data security. Apple had been planning to offer users the ability to encrypt their backed-up iCloud data years ago, but it reportedly dropped the plan in 2018 after the FBI objected.
Apple currently offers end-to-end encryption on its iMessage service so that messages can't be intercepted or read by third parties (including government authorities). But most data stored on iCloud is not end-to-end encrypted; Apple holds the keys, leaving that data accessible to law enforcement armed with subpoenas or warrants. It also leaves that data susceptible to hacking, which has led to cases like this one from June, in which a California man was convicted and sentenced to nine years in federal prison for breaking into thousands of iCloud accounts, stealing private photos and videos of nude women, and sharing them on the internet.
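To make the distinction concrete, here is a minimal sketch, in Python using the third-party cryptography library, of why it matters who holds the encryption key. The flow and names are illustrative assumptions, not Apple's actual implementation.

```python
# Illustration only: why it matters who holds the encryption key.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

photo = b"private vacation photo bytes"

# Provider-held key (roughly how standard iCloud storage works today):
# the data is encrypted, but the provider keeps the key, so it can
# decrypt the data itself when served with a warrant or subpoena.
provider_key = Fernet.generate_key()              # stored on the provider's servers
stored_blob = Fernet(provider_key).encrypt(photo)
print(Fernet(provider_key).decrypt(stored_blob) == photo)  # True: the provider can comply

# End-to-end encryption: the key is generated and kept only on the
# user's device. The provider stores ciphertext it cannot read, and
# neither can anyone who breaches the provider's servers.
device_key = Fernet.generate_key()                # never leaves the user's device
stored_blob = Fernet(device_key).encrypt(photo)
# Without device_key, the provider holds only unreadable ciphertext.
```

The practical upshot is precisely what the FBI dislikes: once the key lives only on the user's device, a warrant served on Apple yields ciphertext, not photos.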
In its announcement, Apple invoked the increasing number of data breaches as justification for this transition: "Experts say the total number of data breaches more than tripled between 2013 and 2021, exposing 1.1 billion personal records across the globe in 2021 alone. Increasingly, companies across the technology industry are addressing this growing threat by implementing end-to-end encryption in their offerings."
As we've seen in China, where the government has tried to crack down on protests against COVID-19 lockdowns, encrypted communications have helped citizens organize in the face of authoritarian surveillance (and it appears as though those protests might actually be working). Apple noted that part of the reason for adding new protections is to provide an "optional level of security for users such as journalists, human rights activists, and diplomats."
This all sounds wonderful in terms of citizen privacy, so of course the FBI is grumpy about it. Never mind all the breaches. Never mind all the crimes that encryption prevents. The FBI only cares that encryption gets in the way of its own investigations.
In a statement emailed to media outlets, an FBI representative said that while the agency sees protecting data security and privacy as a "top priority," it nevertheless sees end-to-end encryption as a threat: "This hinders our ability to protect the American people from criminal acts ranging from cyber-attacks and violence against children to drug trafficking, organized crime and terrorism. End-to-end and user-only-access encryption erodes law enforcement's ability to combat these threats and administer justice for the American public."
The FBI, along with law enforcement agencies in other countries, insists that tech platforms build special backdoors that bypass encryption so that the government can access secured data. End-to-end encryption defies warrants and subpoenas, making it difficult, if not impossible, for agencies to access protected data even when authorized by law.
But in practical terms, there is no such thing as a backdoor that only authorized government officials can access, even if we were to assume those officials would never abuse such access (and we shouldn't assume that). Keys and other means of bypassing encryption can and do escape controlled environments, putting everybody's safety at risk. The federal government has itself suffered a number of data breaches. It's extremely reckless for the FBI or any other law enforcement agency to insist on these backdoors. Their potential to facilitate crime, espionage, and secret government surveillance almost certainly outweighs whatever investigative assistance they provide.
Apple says the bolstered iCloud encryption should be available to Americans by the end of the year and will roll out to the rest of the world in early 2023. The company also announced a couple of other new security features, including support for physical security keys as a form of two-factor authentication, for users who want that extra layer of security.
The Washington Post notes that Apple has also fully dropped its plan to scan all user photos for child porn. Apple announced this plan in 2021 to jeers from privacy experts. While few would object to the goal of wiping out child pornography, Apple's plan involved scanning every single iPhone user's images to see if any of them match a database of known images of child sex abuse. This was a significant unwarranted privacy intrusion, and experts noted that even with the best of intentions, such a system could be adapted and used for authoritarian purposes or censorship.
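To give a rough sense of how that kind of scanning works, here is a minimal sketch of matching files against a database of known-image hashes. Apple's proposed system actually relied on a perceptual hash (NeuralHash) and additional cryptographic machinery rather than the exact hashing shown here; the directory, function names, and database values below are hypothetical.

```python
# Simplified illustration of matching a photo library against a database
# of known-image hashes. Apple's proposal used a perceptual hash
# (NeuralHash) with extra cryptographic protections; exact SHA-256
# matching is used here only to convey the basic idea.
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known prohibited images (placeholder value).
KNOWN_HASHES = {"<hex digest of a known image>"}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Flag every file whose hash appears in the known-image database."""
    return [p for p in photo_dir.iterdir() if p.is_file() and file_hash(p) in KNOWN_HASHES]

# Hypothetical usage:
# flagged = scan_library(Path("/path/to/photos"))
```

The privacy objection follows directly from that structure: whoever controls the database decides what gets flagged, so the same machinery could be pointed at political images or anything else a government wanted found.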
Apple quickly put its plans on pause and now has apparently abandoned them entirely. Federal law already requires Apple to report images of child sexual abuse to authorities whenever it finds them in its systems, but it doesn't require proactive monitoring of users' accounts. Violating our privacy just to make sure we weren't breaking the law seems like a pretty lousy way to treat customers, and it's good that Apple has shut that idea down.