WASHINGTON — The deadly shooting of three U.S. sailors at Naval Air Station Pensacola, Florida, in December could reignite a long-simmering fight between the federal government and tech companies over data privacy and encryption.
As part of its probe into the violent incident, deemed a terrorist act by the government, the Justice Department insists that investigators need access to data from two locked and encrypted iPhones that belonged to the alleged gunman, a Saudi aviation student. The problem: Apple designed those iPhones with encryption technology so secure that the company itself can’t read private messages.
The squabble raises two big questions. First, is Apple required to help the government hack its own security technology when requested? Second, is government pressure on this issue the prelude for a broader effort to outlaw encryption technology the feds can’t break?
THE QUARREL SO FAR
The Justice Department and Apple have been in talks recently over the Saudi student’s iPhone. Justice officials contend that they still haven’t received an answer about whether Apple has the capability to unlock the devices.
During a news conference Monday announcing the findings of the investigation into the Pensacola shooting, Attorney General William Barr said it’s critical for law enforcement to know with whom the shooter communicated, and about what, before he died.
“So far, Apple has not given any substantive assistance,” Barr said. “We call on Apple and other technology companies to help us find a solution so that we can better protect the lives of the American people and prevent future attacks.”
Apple rejected that characterization. “Our responses to their many requests since the attack have been timely, thorough and are ongoing,” the company said.
TRYING THE BACKDOOR
Our phones hold countless messages, files and photos — tracings of our everyday life and work. But in 2013, the whistleblower Edward Snowden revealed the extent to which the government was spying on U.S. citizens. Tech companies like Apple and Google began taking steps to shield those digital tracings from prying eyes — though often not their own — by mathematically scrambling them with encryption.
Apple was one of the first major companies to embrace stronger “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, wants access to that information in order to investigate crimes such as terrorism or child sexual exploitation.
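The scrambling itself is simple to picture. Below is a minimal, purely illustrative Python sketch: a one-time-pad XOR stands in for the real ciphers companies like Apple use, and all names are hypothetical. The point is only that without the key, the stored bytes are gibberish.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each byte with the key: a toy stand-in for a real cipher
    return bytes(p ^ k for p, k in zip(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # known only to sender and recipient

scrambled = encrypt(message, key)
assert decrypt(scrambled, key) == message  # recipient recovers the message
```

In genuine end-to-end systems the key exchange is far more elaborate, but the principle is the same: whoever lacks the key, including the company carrying the message, sees only scrambled bytes.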
Barr and other top cops call the problem “going dark,” as data they used to be able to scoop up with wiretaps has become harder and harder to read.
Although most law enforcement officials are vague about how to solve the problem, security experts say the authorities are basically asking for an engineered “backdoor” — a secret key that would let them decipher encrypted information with a court order.
But the same experts warn that such backdoors into encryption systems make them inherently insecure. Just knowing that a backdoor exists is enough to focus the world’s spies and criminals on discovering the mathematical keys that could unlock it. And when they do, everyone’s information is essentially vulnerable to anyone with the secret key.
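One common form such a backdoor takes is key escrow: each user's key is itself encrypted under a single master key held by the provider. The toy Python sketch below (illustrative names, XOR standing in for real ciphers) shows the fragility the experts describe: whoever obtains the one master key can read everyone's data.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Toy cipher: XOR is its own inverse (stand-in for real encryption)
    return bytes(d ^ k for d, k in zip(data, key))

MASTER_KEY = secrets.token_bytes(64)  # the escrowed "secret key"

def store_message(plaintext: bytes) -> tuple[bytes, bytes]:
    user_key = secrets.token_bytes(len(plaintext))
    ciphertext = xor(plaintext, user_key)
    # Escrow: the user's key is wrapped under the master key, so a
    # court order -- or a thief -- holding MASTER_KEY can recover it
    escrowed_key = xor(user_key, MASTER_KEY)
    return ciphertext, escrowed_key

def backdoor_read(ciphertext: bytes, escrowed_key: bytes) -> bytes:
    # Anyone holding MASTER_KEY can unwrap any user's key
    user_key = xor(escrowed_key, MASTER_KEY)
    return xor(ciphertext, user_key)

ct, ek = store_message(b"private chat")
assert backdoor_read(ct, ek) == b"private chat"
```

The design choice is the vulnerability: security no longer depends on millions of individual keys but on one secret that every attacker in the world has an incentive to steal.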
WHAT LAW ENFORCEMENT CAN DO
Forcing tech companies to engineer backdoors into their security systems would almost certainly require an act of Congress. Legislators, however, have never come close to agreeing on what such a law should look like.
But there are alternatives. Four years ago, the Justice Department took the extraordinary step of asking a federal judge to force Apple to break its own encryption system. The legal move involved an iPhone used by the perpetrator of a December 2015 mass shooting in San Bernardino, California.
Apple acknowledged that it could create the software the feds wanted, but warned that it would be a bad idea. The software could be stolen by hackers and used against other iPhones, the company warned, and might also lead to similar demands from repressive governments around the world.
The FBI ultimately dropped the case shortly before a decisive court hearing, saying a “third party” had found another way of getting into the phone. It never disclosed who that party was; there is an entire industry of shadowy companies such as the Israeli firm Cellebrite that discover or pay for information on flaws in encryption systems. These firms then develop tools to essentially create their own backdoors.
Such companies do significant business with governments and law enforcement. Companies like Apple, meanwhile, do their best to close such loopholes as soon as they learn about them.
WHERE THINGS STAND NOW
Apple is reportedly bracing for another possible legal fight over encryption with the Justice Department. So far, though, there’s no clear sign that the government is headed that way.
“They’re just public shaming and asking nicely,” said Bruce Schneier, an encryption expert at the Berkman Klein Center for Internet and Society at Harvard University. “Hurting everybody’s security for some forensic evidence is a dumb tradeoff.”
Barr said the growth of consumer apps with end-to-end encryption, from Apple’s iMessage and Facebook’s WhatsApp to Signal, has aided “terrorist organizations, drug cartels, child molesting rings and kiddie porn-type rings.” But the government’s legal options could be limited.
For one thing, DOJ’s own inspector general slammed the department in the aftermath of the San Bernardino case, noting that it had made few attempts to break into the iPhone itself before going to court. The FBI unit tasked with cracking phones had only sought outside help the day before the department asked a judge to compel Apple’s assistance, the inspector general’s report found.
The same report found that an FBI section chief knew an outside vendor was almost 90% of the way toward completing a technique that would break into the phone, even as the Justice Department insisted that forcing Apple’s help was the only option.
Civil liberties advocates have also protested. The American Civil Liberties Union called Barr’s demands “dangerous and unconstitutional.”
“Here we are again,” Schneier said. “It’s stupid every time.”