Junk Explained: Apple Vs The FBI, And How That Affects You

George Brandis is siding with the people cracking open iPhones, obviously.

On Wednesday night, Apple Chief Executive Tim Cook released a letter to his customers detailing a complicated fight the company’s having with the FBI. The FBI wanted Apple to help with an investigation by providing access to data in a terrorist’s phone. Apple refused. “We fear this demand would undermine the very freedoms and liberty our government is meant to protect,” Cook said, flooding everyone’s Facebook feed.

With Apple standing up for its users’ privacy and, in the process, making a huge statement about national security, this is being called a battle over the future of encryption. But it’s pretty strange to think that this is a big corporation standing up for values like privacy, democracy and freedom against the US government.

It seems things could be a fair bit more complicated.

What Is All This About?

Earlier this week, a US Federal Court judge ordered that Apple must comply with a request from the FBI for technical assistance. The FBI wanted to recover the data from an iPhone used by Syed Farook — one of the perpetrators of last year’s terrorist attack in San Bernardino, California — and needed the manufacturer’s help to do it. Farook and his wife, Tashfeen Malik, killed 14 people in the mass shooting, and the FBI believe his phone data could determine whether he’d planned the attack in direct contact with ISIS. Unfortunately for them, Farook and Malik were killed in a shootout with police, and Farook didn’t leave his passcode behind.

The phone in question uses Apple’s iOS 9 software, which has a number of security features to protect its data. As Dan Guido, CEO of information security start-up Trail of Bits, wrote, nearly all important data on iOS, including messages and photos (even selfies), is encrypted with a combination of the phone passcode and a hardware key embedded in the device when it’s manufactured. In other words: your personal shit is pretty safe, encrypted beyond even what Apple can ordinarily retrieve.
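To get a rough feel for what “combining the passcode with a hardware key” means, here’s a minimal Python sketch. It is purely illustrative: Apple’s real scheme runs inside dedicated hardware and differs in detail, and the `DEVICE_UID` value and `derive_data_key` function here are made up for the example. The point it shows is that the encryption key depends on both the passcode and a per-device secret, so guessing has to happen on the phone itself.

```python
import hashlib

# Made-up per-device hardware secret -- stands in for the key burned
# into the phone at manufacture. Not extractable in the real design.
DEVICE_UID = bytes.fromhex("a1" * 32)

def derive_data_key(passcode: str) -> bytes:
    # Deliberately slow key derivation: every guess costs real time,
    # and the result depends on BOTH the passcode and the device key.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), DEVICE_UID, 100_000
    )

key = derive_data_key("1234")
print(len(key))  # 32 -- a 256-bit key; a different passcode or device yields a different key
```

Because the device key never leaves the phone, an attacker can’t copy the encrypted data off the handset and brute-force it on a supercomputer; they’re stuck guessing on the device, at the device’s pace.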

So far, the FBI have been unable to guess the correct passcode on Farook’s phone (it’s safe to assume it’s a little trickier than ‘1234’). And matters are made more complicated by the fact it has a function that will erase data after 10 failed passcode attempts.

The FBI wants assistance getting around this feature so it can have unlimited attempts at cracking the passcode using a powerful computer that can input millions of different combinations, also known as a “brute force” attack. As Guido concludes, Apple is technically able to assist the FBI to do this; the question instead becomes whether they should.
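To see why the auto-erase feature matters, here’s a toy Python simulation — nothing like Apple’s actual code, with a made-up passcode and attempt counter — of what happens when a brute-force attack runs into a 10-attempt wipe limit:

```python
# Toy model of the 10-attempt auto-erase. The secret passcode and the
# state dictionary are invented for this sketch.
SECRET_PASSCODE = "7391"
MAX_ATTEMPTS = 10

def try_passcode(guess: str, state: dict) -> str:
    if state["wiped"]:
        return "data erased"
    state["attempts"] += 1
    if guess == SECRET_PASSCODE:
        return "unlocked"
    if state["attempts"] >= MAX_ATTEMPTS:
        state["wiped"] = True  # too many failures: the data key is destroyed
        return "data erased"
    return "wrong passcode"

state = {"attempts": 0, "wiped": False}
# Naive brute force: try every 4-digit code in order.
results = [try_passcode(f"{n:04d}", state) for n in range(10_000)]
print(results.count("unlocked"))  # 0 -- the "phone" wipes itself after 10 misses
```

With the wipe limit in place, the attack dies ten guesses in; with it removed — which is what the FBI is asking for — the same loop would eventually land on the right code.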

Is This Even Legal?

Legality is tricky in this case. On the one hand, the FBI has a warrant for this information, and it will assist an investigation into a serious and heinous crime. On the other is the question of whether this will create a dangerous precedent that undermines data protection more generally. It’s strict-letter law versus public policy.

The FBI made their case to the judge on the basis of the All Writs Act, a broad piece of legislation (enacted in 1789) that is a residual source of authority for the judiciary. It’s kind of a stopgap law that allows US courts to make orders compelling a company to do something when no existing statute will work. There are, however, conditions on when it can be used: there must be extraordinary circumstances, the company must be closely connected to the matter of the order, and the order must not place an undue burden on the company.

As detailed in the court order, Judge Sheri Pym found there was “good cause” for issuing the order that Apple give “reasonable technical assistance” to the FBI. A warrant was already issued to search the phone’s contents, so legally the FBI should have access to what’s inside it. Further, it was Farook’s work phone, and the FBI has permission from his employer to search the device.

Apple is saying that despite this, the way they are being asked to get the information is not reasonable and could endanger the future of encryption worldwide.

Similar orders have been made under the All Writs Act before. Court documents reveal that Apple was ordered to assist the government in unlocking an iPhone 5S back in November 2014. Judge Kandis A. Westmore held that Apple’s reasonable technical assistance may include bypassing the iOS device’s user passcode so that the FBI could extract data. However, in this case, she made an important exception: Apple was not required to attempt to decrypt or otherwise help law enforcement access encrypted data.

After this case, Apple rolled out iOS 8, which significantly expanded encryption on Apple products and rendered the 2014 court order useless against devices running later software. Farook’s phone, though a 2013 model, was running iOS 9, which is what makes this case an “unprecedented” use of the All Writs Act, according to Apple.

Why Are Apple Making Such A Fuss About This One?

Apple is arguing, contrary to the FBI, that complying with this request would fundamentally undermine the security of all of its encryption technology. One analogy is that building the software the FBI requests is the same as creating a master key, one that would endanger all users of iOS and undermine encryption more generally.

“The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers from sophisticated hackers and cybercriminals,” Cook said in his statement. Gizmodo journalist Kate Knibbs agrees, likening the situation to asking a safe manufacturer to create a tool that allows people to open all safes built by the company.

Some privacy experts have weighed in with concerns that this precedent would embolden the Russian and Chinese governments to demand greater access to mobile data. Christopher Wolf, director of privacy and information management at law firm Hogan Lovells, told The Guardian that the fact such requests might start coming from authoritarian regimes is enough reason to side with Apple.

Amnesty International has shown support for Apple based on the same concerns that such a backdoor might make it easier for some countries to abuse human rights.

Meanwhile, our own Attorney-General George Brandis has sided with the FBI. He’s argued that it’s frustrating for law enforcement, who already have a warrant for encrypted data, to face roadblocks from manufacturers in this way. Australian lawyer Josh Bornstein, speaking to ABC’s The Drum, also said he was on team FBI, stating that no company is above the law.

How Does This Affect Us?

It depends on which side of the argument you believe. According to the FBI and White House press secretary Josh Earnest, the modifications the FBI is asking for will only affect this one device. Once the data is recovered, the FBI will return the phone to Apple so that the customised iOS never physically leaves the company.

But according to Apple, there’s serious risk of a knock-on effect. They argue this could not only create a new legal precedent that will force more compliance with government agencies, but also compromise security from a technical perspective. Apple says there’s a risk that the bypass to their security system, once created, will make its way into the wrong hands — and be used “over and over again, on any number of devices”.

There is almost certainly a large element of corporate interest to Apple’s strategy here too. Standing up for their customers as a protector of privacy fits perfectly into their overall branding. Plus, if their software is hackable, users will lose trust in their product.

There has been some speculation that individual iPhone users would be able to protect themselves from this kind of backdoor anyway. They could do this by manually setting a strong passcode using a random 11-digit numeric sequence. Technologist Micah Lee, writing for The Intercept, calculated that this kind of passcode would take 253 years to crack on an iPhone from 2012 or later.
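Lee’s figure is easy to sanity-check. Assuming the roughly 80 milliseconds per guess he cites (a delay enforced by the phone’s hardware), trying every possible 11-digit code works out to about two and a half centuries — the rates here are his stated assumptions, not anything measured for this article:

```python
# Back-of-envelope check of the 253-year claim, using Lee's assumed
# guess rate of ~80 ms per attempt on the device.
seconds_per_guess = 0.08
total_codes = 10 ** 11  # every possible 11-digit numeric passcode

worst_case_seconds = total_codes * seconds_per_guess
years = worst_case_seconds / (60 * 60 * 24 * 365.25)
print(int(years))  # 253 -- centuries to exhaust the whole keyspace
```

On average an attacker would hit the right code about halfway through, but even half of 253 years is well beyond any investigation’s patience.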

This might be true, but this battle is shaping up to be as much about the architecture of data protection laws as the technology of encryption. As with anything to do with information privacy, this revolves around a familiar concern: will this strike the right balance between user privacy and the needs of law enforcement?

Putting commercial, technical and other concerns aside, that is the real question users and lawmakers should be asking.

Em Meller is a writer, journalist and editor with a focus on law and digital privacy. Her work has appeared in Junkee, Overland, The Justinian and on 2SER.