Apple Challenges FBI's iPhone Demand as 'Oppressive'

Apple on Thursday asked a court to quash a judicial order that would force the company to help the Justice Department unlock an iPhone used by one of the San Bernardino attackers, arguing that the order imposed an “unprecedented and oppressive” burden on the tech giant.

The motion to vacate was the latest step in a high-stakes legal battle that could stretch out for months and possibly wind up at the Supreme Court.

The filing comes as Apple and the US government engage in a public back-and-forth that in the coming weeks will extend to an appearance before Congress and a court hearing. While the dispute centers on a single locked iPhone 5c, it has far-reaching implications for how a digital society balances privacy and civil liberties with the demands of law enforcement.

“This is not a case about one isolated iPhone,” Apple wrote in its motion, filed in the US District Court for the Central District of California. “Rather, this case is about the Department of Justice and the FBI seeking through the courts a dangerous power that Congress and the American people have withheld: the ability to force companies like Apple to undermine the basic security and privacy interests of hundreds of millions of individuals around the globe.”


The company argues that the government is attempting to cut off a debate about the privacy issues in this case and that asking Apple to create a back door would expose personal information “to hackers, identity thieves, hostile foreign agents, and unwarranted government surveillance.”

Apple says that if the order is upheld, it would set a precedent that could lead to the firm’s being forced to hire people for a “new ‘hacking’ department” to handle a large number of government requests.

The order last week from a magistrate judge in Riverside, Calif., did not ask Apple to break the phone’s encryption but rather to disable the feature that deletes the data on the phone after 10 incorrect tries at entering a password. That way, the government can try to crack the password using “brute force” – attempting thousands or millions of combinations without risking the deletion of the data.
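To make the mechanics concrete, here is a minimal, purely illustrative Python sketch of a brute-force search over a four-digit passcode space, with and without a 10-attempt limit. The check_passcode function is a hypothetical stand-in for the phone’s real verification routine, which is not publicly accessible, and nothing here reflects Apple’s actual implementation.

```python
# Purely illustrative sketch: a toy model of why the auto-erase limit matters.
# check_passcode() is a hypothetical stand-in for the phone's real verification
# routine; nothing here reflects Apple's actual implementation.
from itertools import product

MAX_ATTEMPTS_BEFORE_WIPE = 10   # the safeguard the order asks Apple to disable
SECRET = "7531"                 # hypothetical four-digit passcode

def check_passcode(guess):
    """Stand-in for the device's passcode check."""
    return guess == SECRET

def brute_force(limit=None):
    """Try every four-digit combination, stopping if an attempt limit is hit."""
    for attempt, digits in enumerate(product("0123456789", repeat=4), start=1):
        if limit is not None and attempt > limit:
            return None  # on a real device, the data would now be erased
        guess = "".join(digits)
        if check_passcode(guess):
            return guess
    return None

print(brute_force(limit=MAX_ATTEMPTS_BEFORE_WIPE))  # None: blocked after 10 tries
print(brute_force())                                # "7531": found once the limit is gone
```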

The FBI has insisted that it is not asking for a back door or a master key, and instead argues that its requests here are narrow and limited to this case. But Apple has publicly pushed back on that in recent days, with Tim Cook, the company’s chief executive, saying it “would be bad for America” if the firm complied with the government.

“This is the hardest question I have seen in government,” FBI Director James B. Comey said Thursday at a House Intelligence Committee hearing. No matter the court outcome, he said, the broader policy question is one that the people and Congress should decide.


“It’s really about who do we want to be as a country and how do we want to govern ourselves,” Comey said.

Comey wrote in a public letter earlier this week that the iPhone in question could contain information about other terrorists. On Thursday, though, Apple cited Comey’s letter and said the government had offered “nothing more than speculation” about what the iPhone could produce.

Apple and its tech-industry supporters are casting the issue here as sweeping, with “chilling” implications. But the government is striving to paint the matter as narrowly as possible, stressing that authorities are asking for a software modification to apply to only one phone, which was used by one of the attackers, Syed Rizwan Farook.

Farook was killed in a shootout with police hours after the attack. Data on his phone, which was given to him in his job at the county health department, has remained inaccessible since it was recovered by authorities, although the FBI accessed some data that was backed up to iCloud in the months before the attack.

After Apple filed its motion, the Justice Department said that its approach to investigations had not changed but that Apple had made a “recent decision to reverse its long-standing cooperation” with orders.

“Law enforcement has a longstanding practice of asking a court to require the assistance of a third party in effectuating a search warrant,” department spokeswoman Melanie Newman said in a statement. “When such requests concern a technological device, we narrowly target our request to apply to the individual device.”

Prosecutors said in a court filing last week that Apple would be allowed to take the phone to a secure location and let the government remotely try new passwords, meaning that only the tech company would possess the new software, keeping it from falling into someone else’s hands.

Newman said the department’s attorneys were reviewing the filing and would respond in court.

In its motion Thursday, Apple rejected the government’s claims that this is a narrow request, arguing that acceding in this instance would undermine the security of the company’s devices and expose its customers to vulnerabilities.

“In short, the government wants to compel Apple to create a crippled and insecure product,” Apple wrote in the filing. “Once the process is created, it provides an avenue for criminals and foreign agents to access millions of iPhones.”

Two key issues in the case are whether a 1789 law, the All Writs Act, permits the court to issue the order it did, and whether the burden it imposes on Apple is reasonable.

Apple said the 18th-century law was never intended to grant the courts “free-wheeling authority” to force the firm to do something that Congress has not approved. Congress has over the years debated the proper scope of industry’s assistance to the government in surveillance matters but has exempted “information services providers” such as Apple from being required to make their systems wiretap-ready, the firm noted. So the All Writs Act cannot be used to “compel assistance where Congress has considered, but chosen not to” grant such authority, Apple said.

Complying with the order, Apple argued, would create an “undue” burden that an Apple official said would require “between six and ten Apple engineers and employees dedicating a very substantial portion of their time for two weeks at a minimum, and likely as many as four weeks.”

Besides dismantling the auto-erase feature, the FBI wants Apple to disable a safety feature that imposes delays after repeated failed attempts to enter a password or passcode. And the bureau wants to be able to enter multiple passcodes electronically. Doing that would require modifying existing code, testing and validating the new software and setting up a secure facility where the FBI could try to break the passcode, said Erik Neuenschwander, Apple’s user-privacy manager.
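To illustrate why those delays matter as much as the auto-erase feature, here is a rough, hypothetical calculation. The escalating delay schedule and the roughly 80-millisecond floor per electronic attempt are assumptions made for the example, not Apple’s documented figures.

```python
# Purely illustrative sketch: the delay schedule below is an assumption made
# for this example, not Apple's documented behaviour.
def delay_after(nth_failure):
    """Assumed lockout delay, in seconds, imposed after the nth failed attempt."""
    if nth_failure < 5:
        return 0
    if nth_failure == 5:
        return 60
    if nth_failure == 6:
        return 5 * 60
    if nth_failure in (7, 8):
        return 15 * 60
    return 60 * 60  # every failure from the ninth onward

def hours_to_exhaust(combinations):
    """Wall-clock hours needed to try every combination under the assumed delays."""
    return sum(delay_after(n) for n in range(1, combinations + 1)) / 3600

# A four-digit passcode space has 10,000 combinations.
print(f"{hours_to_exhaust(10_000):,.0f} hours with the delays in place")
print(f"{10_000 * 0.08 / 60:.0f} minutes with delays removed (assuming ~80 ms per try)")
```

Under those assumptions, exhausting a four-digit space would take on the order of 10,000 hours with the delays in place, but only minutes once they are removed and passcodes can be submitted electronically.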

If the order is upheld, the company argued, a host of other law enforcement requests to help unlock other iPhones will follow. That would require the firm to develop new versions of “backdoor software” every time a phone operating system changes. Apple engineers would have to testify about the software as government witnesses in trials, and new secure facilities would have to be built that are monitored around the clock, the firm’s lawyers said.

“Nothing in federal law allows the courts, at the request of prosecutors, to coercively deputize Apple and other companies to serve as a permanent arm of the government’s forensic lab,” they wrote.

The order would set a precedent “for conscripting Apple and other technology companies” to develop technology for all manner of criminal investigations, they said. Nothing would stop the government from forcing the company to help turn on a phone’s microphone or video camera to record conversations or help authorities track a person’s movements, they said.

Under the same legal theories, Apple asserted, the government could compel a drug company against its will to make drugs to carry out a lethal injection in a death penalty case, or force a journalist “to plant a false story in order to help lure out a fugitive,” or make a software company insert malicious code into its updates to enable court-ordered surveillance.

Such “sweeping powers,” the firm said, “simply are not authorized by law and would violate the Constitution.”

Apple also challenged the order on constitutional grounds, arguing that it violates the firm’s First Amendment speech rights. Computer code is “speech,” Apple argued, citing case law. It said that forcing the company to create software that will dismantle safety features built into a phone “amounts to compelled speech and viewpoint discrimination.”

In its motion, Apple again criticized the FBI for having Farook’s iCloud password changed after the attack, writing that if agents had checked with the tech company before they “inadvertently foreclosed a ready avenue,” the current court fight might have been averted.

The government is not asking Apple for a back door to circumvent encryption, said Michael Vatis, a former Justice Department official who is now a partner at Steptoe & Johnson. “Instead, the government is asking Apple to help it pick the lock on a door that Apple itself built into the phone,” he said, equating the auto-erase feature with a lock. “But I think Apple makes a strong argument that making them pick that lock is an unreasonable burden.”

Former security officials, though, say that Apple’s stance hampers law enforcement’s duty to protect the public.

The court order to Apple was “very specific to one phone,” said Mike Rogers, a former chairman of the House Intelligence Committee and a former FBI agent. “This notion that it opens up a privacy issue for every device out there is nonsense.”

Oral arguments in this case are scheduled for March 22 in Riverside before Magistrate Judge Sheri Pym. Among the lawyers Apple has enlisted is high-powered Republican attorney Ted Olson, a former US solicitor general.

