By David K. Shipler
We might be approaching a tipping point about privacy, as dramatized by the Apple-FBI dispute over decrypting a terrorist’s iPhone. After years of seeing privacy and safety as opposites in the war on terrorism, important segments of American society seem to be recognizing personal security and national security as parts of the same whole, not as a dichotomy in a zero-sum game. If this evolution continues, it could eventually produce a significant correction to the surveillance state that developed after the trauma of Sept. 11, 2001.
In the meantime, however, the two versions of security are colliding: the government’s rising concern about security from crime and terrorism in an age of digital encryption on the one hand, and, on the other, the public’s heightened interest in security from hackers, identity thieves, cyber-ransom demands, and—yes—government surveillance. Both sets of anxiety are justified. How to resolve the clash intelligently is far from clear.
The FBI’s effort to force Apple to create new software to disable an iPhone’s security features is propelling the courts forward faster than they typically travel. They usually lag well behind technology. But now they and Congress need to catch up quickly. That phone and hundreds of others sit in evidence lockers waiting to be cracked by law enforcement, requiring a creative effort by judges, legislators, prosecutors, and high-tech companies to make it possible—legally and technically—to execute a legitimate search warrant on a particular device without the risk of compromising security on all such devices.
Not since 9/11 have Americans’ worries about preserving privacy infused the society as extensively as today. The apprehensions were on display this week among members of the House Judiciary Committee, which heard testimony on the Apple-FBI dispute. Many of the lawmakers, Republicans and Democrats, sounded receptive to the privacy arguments as they heard from witnesses who included the FBI director, James Comey, and Apple’s general counsel, Bruce Sewell.
The government’s own misbehavior has helped raise citizens’ privacy concerns: the warrantless eavesdropping by a National Security Agency that was ordered by the Bush administration to go around the Foreign Intelligence Surveillance Court; the disclosures by the NSA contractor Edward Snowden, which documented the agency’s sweeping collection of “meta-data” comprising virtually all Americans’ contacts by phone numbers and email addresses; and local law enforcement’s aggressive evasion of the Fourth Amendment’s warrant requirements to frisk pedestrians without cause, secretly attach GPS tracking devices to vehicles, and monitor people’s locations through cellular phone records.
The courts, Congress, and the Obama administration have curtailed some of these abuses, but insufficiently. And the Internet knows no national boundaries, of course, so global hackers steal identities and tax returns and US government officials’ personal data, encrypt computer files and extort ransoms to decrypt them, and—in a recent case—shut down a power grid in Ukraine.
No wonder people are a bit nervous. If they weren’t, they’d surely vote 100 percent for Apple to unlock the phone. Instead, a bare majority of 51 percent favor Apple’s cooperation, according to a Pew poll with a 3.7-percent margin of error. (38 percent oppose decryption, and the rest have no opinion.) This is surprising. Public relations-wise, it’s such a perfect case for the FBI. Who wouldn’t want a dead terrorist’s phone mined for possible information on any unknown associates he might have had?
The phone is an iPhone 5c running Apple’s advanced operating system, iOS 9. It was found during the execution of a search warrant for the car of Syed Rizwan Farook after he and his wife murdered 14 people and wounded 22 in San Bernardino, California, last December, and then were killed during a pursuit. Two other phones of theirs had been smashed; this was the only one intact. It was owned by his employer, the San Bernardino County Health Department, which consented to the FBI’s search of the device. (The FBI obtained a warrant anyway.)
Investigators want to learn whether Farook was assisted or contacted by others who might be implicated in the attack. While Apple immediately helped provide some information stored outside the phone, it could not unlock the device itself without writing new code, which it refused to do.
Some of that information would have been available through the automatic backups of the phone to the iCloud. But the FBI messed up badly. The iCloud backups stopped last October, about six weeks before the attacks. They could have been updated with at least some of the phone’s later information, but as Comey admitted in the House hearing, the FBI—without checking with Apple—asked the county to change the iCloud password to facilitate immediate access to data already there. Once that was done, the phone would no longer back up, and the more recent data resided in the phone alone.
Unlike very early versions, this operating system allows users to activate a failsafe encryption system that destroys all data after 10 erroneous passcodes are entered. The FBI wants Apple to disable that feature so that a computer can bombard the phone with a “brute-force” series of passcode guesses, which Comey said should find the right one in no more than 26 minutes. To accomplish this, two other safeguards of iOS 9 would have to be disabled: one that imposes a delay after each wrong passcode, and another that requires the codes to be entered manually on the touch screen rather than by computer.
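The arithmetic behind that estimate can be sketched. The attempt rate below is an assumption back-derived from Comey’s 26-minute worst case for a four-digit passcode; neither figure comes from Apple, and a longer passcode changes the picture dramatically.

```python
# Hypothetical sketch of the brute-force arithmetic. The attempt rate is
# back-derived from Comey's 26-minute figure for a 4-digit passcode; it is
# an illustrative assumption, not a published iOS specification.

def worst_case_minutes(digits: int, attempts_per_second: float) -> float:
    """Time to exhaust every numeric passcode of the given length."""
    combinations = 10 ** digits
    return combinations / attempts_per_second / 60

# ~6.4 attempts per second reproduces the 26-minute figure for 4 digits.
RATE = 10_000 / (26 * 60)

print(round(worst_case_minutes(4, RATE)))       # 26 minutes
print(round(worst_case_minutes(6, RATE) / 60))  # ~43 hours for 6 digits
```

The same rate applied to a six-digit code stretches the worst case from minutes to roughly two days, which is why the delay-after-failure safeguard matters even more than the guessing itself.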
“It pains me to say this,” Comey confessed to the committee, but the government’s 16 other intelligence agencies could not figure out how to unlock the phone. “Apple is very good. They set out to design a phone that can’t be opened, and they’ve darned near succeeded.”
Apple and some technicians outside the company warn that such software, if created, would be such an inviting and valuable target for hackers, criminals, and foreign governments that it might be stolen and applied to millions of other users’ phones, compromising personal medical, financial, and work-related information, plus the locations of the user’s children and other sensitive data. “There is probably more information stored on that device than a thief could steal by breaking into your house,” said Sewell, Apple’s general counsel.
In addition, encrypted smart phones are increasingly being used as log-in methods safer than typing passwords into your computer, according to Susan Landau, a professor of cybersecurity policy at Worcester Polytechnic Institute. “It’s really about security vs. security,” she said. “NSA will tell you that stealing log-in credentials is the most effective way into a system. Smart phones are poised to become authenticators into systems,” which may include power grids, water systems, and the like.
Indeed, the former NSA and CIA director Michael Hayden has sided with Apple, calling cyber insecurity the greatest danger to national security.
But couldn’t Apple keep a new code to itself? Apple and some outside technicians don’t think so. First, depending on what a court orders, the software might be available to any law enforcement agency that could get a warrant. Although the government contends in its California brief that the code would be used to crack that phone only, Comey conceded under questioning that a court victory for the FBI would set a precedent to be used in other cases, and they’re numerous. A dozen phones are in FBI hands waiting to be examined. Cyrus Vance Jr., the Manhattan District Attorney, told the House committee that his lab has 205 phones that can’t be cracked, and that other D.A.s across the country have many as well: 46 in Connecticut, for example, over 100 in Houston, and so on.
Furthermore, with a barrage of warrants to handle, Apple would presumably have to create a compliance division with access to the code, according to Alex Abdo, an attorney with the American Civil Liberties Union. That division, in possession of lucrative “malicious software,” as he put it, would become “an irresistible target” for hackers, criminals, and foreign intelligence agencies.
Michael Chertoff, a former Secretary of Homeland Security, said, "Once you've created code that's potentially compromising, it's like a bacteriological weapon. You're always afraid of it getting out of the lab."
Apple won a temporary victory this week in New York, where a federal magistrate judge denied the FBI an order to compel the company to open another phone running an earlier operating system. It belongs to a New York defendant who has pleaded guilty and is awaiting sentencing for distributing methamphetamine.
The legal issues are the same as in California, where the government is trying to invoke the All Writs Act of 1789, which has been interpreted as empowering the judiciary to require the assistance of third parties in executing search warrants and subpoenas, so long as the orders are “agreeable to the usages and principles of law.” Precedents cited by the government include getting a credit card company to turn over a customer’s charge records and forcing a phone company to install a device to track numbers called.
But the cases cited do not include requiring a company to create something new, such as computer code. Nor does the federal law governing cooperation by communications companies, the Communications Assistance for Law Enforcement Act (CALEA), which is silent on the matter. That was central to the New York judge’s finding that the All Writs Act was “unavailable because Congress has considered legislation that would achieve the same result but has not adopted it.”
The ball should be in Congress’s court, said AT&T’s general counsel, David McAtee, as he announced that the telecommunications giant had filed an amicus brief in support of Apple’s position. While these cases and others are likely to work their way up to the Supreme Court, courts don’t do nuance very well, and sophisticated nuance is needed to mesh legal principles with rapidly advancing technology.
The mutual security interests of citizens’ safety and privacy should not depend on ad hoc rulings “by judges presiding over individual cases,” McAtee declared, but “by Congress providing a clear, uniform legal framework for all participants in the new digital economy.” He probably has in mind a Congress that suddenly becomes functional.