The company has taken some laudable steps toward protecting its customers’ privacy, but recently revealed DOJ data seizures remind us that Apple needs to work harder.
By Max Eddy
On June 10, The New York Times reported that the Trump administration's Department of Justice (DOJ) sought information from Apple regarding two of the then-president's most prominent critics, who also happened to be members of Congress. Days later, the Times reported that the DOJ also sought information from Apple regarding former White House counsel Donald McGahn. It's an uncomfortable reminder that Apple has given data to law enforcement thousands of times, and that it holds lots of sensitive data to give.
In its most recent transparency report, which covers January to June of 2020, Apple said it handed over user data to US law enforcement 2,590 times. Apple said this could include (but is not limited to) photos, emails, contacts, calendars, and iOS device backups. In all, Apple received 9,872 requests for data in that period, with the US responsible for 5,861 of them.
To the company’s credit, we can ponder these numbers only because Apple supplies them in the first place. And in fairness, Apple is not alone in its disclosures. Google, the search giant and operator of the world’s most popular mobile OS, reported that during the same period in 2020, it turned over at least some user data 83% of the time. Apple reported that it responded with at least some data 87% of the time.
To someone who writes a lot about privacy, these are eyebrow-raising figures. In the world of VPNs, which I cover for PCMag, disclosing any information is unusual, as is having any significant information to disclose in the first place. ProtonVPN's stance is more the norm: "As we do not have any customer IP information, we could not provide the requested information." The creators of Signal Private Messenger take a similar position and have actively worked to ensure that customer data isn't available to hackers, law enforcement, or Signal itself.
Granted, most privacy-focused companies are tiny by comparison to titans such as Apple and Google. But these smaller companies have designed their products to gather as little information as possible and to ensure that whatever information they do retain isn’t accessible to them or anyone else. Apple needs to follow this model now more than ever.
In the last few years, Apple has embraced privacy not just as good practice but also as good marketing. It has added a lock-out feature to make it harder for law enforcement to force iPhone owners to unlock their devices with biometrics. Don't like Alexa or the Google Assistant listening to your every utterance? Apple says that its on-device voice recognition can match the competition without sending recordings out for processing.
Apple has taken an aggressive approach to protecting user privacy from advertising, both on its devices and when its customers are browsing the web. Apple will generate fake email addresses to help you cut down on spam. The company has made it easier to see which apps are monitoring you and given consumers more tools to rein in that monitoring. The fact that Facebook objects to some of these changes has always struck me as a sign that Apple is doing the right thing.
Most dramatic, perhaps, was when Apple came under pressure to unlock an iPhone used by one of the perpetrators of a mass shooting in San Bernardino, California. Apple refused, and it was a rare moment of unanimous support across the industry.
Those were high points, but there have been low points as well. Apple caved to pressure to remove an app used by Hong Kong protesters. Of much broader note is the fact that Apple still doesn't encrypt its iCloud backups. In January of 2020, Reuters reported that Apple had pondered encrypting iCloud backups and had even told the feds it was going to do so. This apparently never came to pass, for reasons that are unclear.
Apple's transparency report explicitly lists iCloud backups among the kinds of data it currently provides to law enforcement. These backups contain most, if not all, of the information on an iPhone or iPad, and Macs can be set to back up data to iCloud as well. An iCloud backup can (optionally) include data from Apple's Messages app, which is encrypted end-to-end, except in backups. That might be a chilling revelation for users who assume their Messages data is always secure. Google, interestingly, does encrypt its backups.
The Reuters article presents several possibilities for why Apple reversed course on encrypting backups. Perhaps Apple wanted to play ball with the FBI. Maybe the iPhone maker foresaw the headaches of angry consumers unable to retrieve their data after forgetting a password. Or perhaps Apple worried that encrypting backups could lead to legislation that would place intrusive limits on encryption—an ongoing struggle.
There is surely a complicated calculus going on inside Apple’s steel and glass doughnut about customer privacy. But what is clear is that the current system is not working. Too much customer data is available, and there are a staggering number of legal requests for it. Finally, as the recent flurry of questionable requests from the DOJ demonstrates, the potential for abuse is all too present.
Apple may not be in a position to fight every request for information that it receives, but it can follow the example of privacy-focused companies and simply make user information irretrievable to anyone except users. While it was questionably “courageous” for Apple to do away with the headphone jack, the company has a chance to show real courage and protect its customers at the same time.