On the surface, the Apple-FBI case seemed uncomplicated. The FBI had obtained the iPhone 5c belonging to slain gunman Syed Farook. Farook and his wife shot dead 14 people in a terrorist attack in San Bernardino, California, last December. Law enforcement authorities wanted to go through messages and other data on the phone. But iPhones have inbuilt encryption, computer software that automatically encodes data into an unreadable form called ciphertext, which is only decrypted when the phone is unlocked with its password. Guessing the four-digit password wasn't an option, as data is automatically erased after too many incorrect attempts.
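For readers curious about the mechanics, the protection described above can be sketched in miniature. This toy Python model is an illustration only: the XOR "encryption", the ten-attempt limit and all the names here are invented stand-ins, whereas a real iPhone entangles the passcode with a hardware-bound key and uses proper AES encryption.

```python
import hashlib
import os

MAX_ATTEMPTS = 10  # illustrative limit; the real device's policy differs


class PhoneStore:
    """Toy model of passcode-gated encrypted storage (illustration only)."""

    def __init__(self, passcode: str, secret: bytes):
        self.salt = os.urandom(16)
        # Derive an encryption key from the passcode. Real devices mix the
        # passcode with a key fused into the hardware, so the derivation
        # cannot be run on another machine.
        key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), self.salt, 100_000)
        # "Encrypt" by XOR against a keystream -- a stand-in for real AES.
        stream = hashlib.sha256(key).digest()
        self.ciphertext = bytes(b ^ stream[i % 32] for i, b in enumerate(secret))
        self._check = hashlib.sha256(key + b"check").digest()
        self.failed = 0
        self.wiped = False

    def unlock(self, guess: str):
        """Return the plaintext for a correct passcode, else None."""
        if self.wiped:
            return None
        key = hashlib.pbkdf2_hmac("sha256", guess.encode(), self.salt, 100_000)
        if hashlib.sha256(key + b"check").digest() != self._check:
            self.failed += 1
            if self.failed >= MAX_ATTEMPTS:
                self.ciphertext = b""  # erase data after too many failures
                self.wiped = True
            return None
        stream = hashlib.sha256(key).digest()
        return bytes(b ^ stream[i % 32] for i, b in enumerate(self.ciphertext))
```

The point of the wipe-after-failures rule is visible even in this sketch: brute-forcing the four-digit passcode destroys the data before the space of 10,000 guesses can be searched, which is exactly why the FBI wanted the rule disabled in software.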
So the FBI asked Apple to modify the device's software – adding backdoor access – so that the FBI could work out the password without data being erased. The problem is that once such a modification exists, it creates a weakness that leaves every one of the tens of millions of iPhone users vulnerable. It's as if every household door required a special lock that would open for the householder's key, but also for a global master key. Useful if gardaí needed to enter a house quickly for a legitimate reason, but everyone's house would be vulnerable if that master key were stolen or copied.
One might argue that this is not a house, just a phone. However, phones carry the intimate details of our lives: text and chat messages, location data, emails, apps for accessing bank accounts, health data, browser history, photos and video, and contact details for friends, family and colleagues. That's why encryption is a critical, everyday, generally unnoticed part of phone operating systems. We want to protect our information from the nosy, but also from criminals and hackers. In many parts of the world, human rights activists also need to shield it from repressive regimes.
Apple, backed by dozens of other companies and civil society organisations, refused to provide a way of unlocking the phone, arguing that such a tool would place everyone at risk and establish an alarming precedent. Internationally renowned security authority Bruce Schneier wrote of the case: "We cannot build a backdoor that only works for a particular type of government, or only in the presence of a particular court order. Either everyone gets security or no one does. Either everyone gets access or no one does".
Last Monday, the FBI abruptly dropped its case, announcing it had broken into the phone. Instead of defending a court case, Apple will now rush to find and fix the weakness that allowed such access. This unexpected denouement is not the end, however, but only the beginning of a broader and critical societal debate, often presented as finding a balance between privacy and security. But the Apple-FBI case has shown that it is, more correctly, about defining and balancing different types of security. If the digital world cannot function without encryption, whose security prevails?