In 2015, the FBI unsuccessfully tried to persuade Congress to regulate encryption. To combat the so-called problem of criminal investigations “going dark,” some law enforcement professionals want tech companies to be able to bypass their own encryption. The FBI and intelligence community accepted that although Congress was not open to this idea in 2015, attitudes might change in the event of a terrorist attack in which strong encryption could be shown to have hindered law enforcement.
In December 2015, a shooting in San Bernardino, California killed 14 people and wounded 22 others. The shooters were alleged to have been ISIS sympathizers, so the event was quickly categorized as a terrorist attack. One of the alleged shooters had an iPhone that the FBI wanted to access, but encryption complicated matters: investigators needed the passcode to decrypt the contents before anyone in possession of the phone could examine the device. The FBI thus could not view the data without entering the passcode, yet the phone was designed to erase everything on the device if the wrong passcode was entered ten times.
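The lockout mechanism described above can be sketched in a few lines. This is purely illustrative, not Apple’s actual implementation; the class name, the attempt limit of ten, and the placeholder data are assumptions drawn from the article’s description.

```python
MAX_ATTEMPTS = 10  # limit described in the article; assumed constant here


class LockedDevice:
    """Illustrative model of a phone that wipes itself after too many
    failed passcode attempts (NOT Apple's real design)."""

    def __init__(self, passcode: str) -> None:
        self._passcode = passcode
        self._data = "encrypted user data"  # placeholder contents
        self.failed_attempts = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> bool:
        """Return True and unlock on a correct guess; on the tenth
        consecutive wrong guess, erase the data permanently."""
        if self.wiped:
            raise RuntimeError("device has been wiped")
        if guess == self._passcode:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_ATTEMPTS:
            self._data = None   # erase everything on the device
            self.wiped = True
        return False
```

Under this model, a brute-force search is futile: the tenth wrong guess destroys the very data the attacker is after, which is why the FBI wanted a tool that disables the limit rather than simply more time to guess.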
The FBI went to Apple to request its cooperation in creating a tool to bypass this protection, so that investigators could try different passcode combinations until they found one that worked. This would enable them to access the suspect’s iPhone to look for evidence. Apple said no. The FBI went to court and obtained an order compelling Apple to comply. Apple still said no. The public relations situation grew worse and worse for the FBI.
Many people have remarked that the FBI was seeking legal precedent for compelling service providers to break their own encryption or security protections. But with Apple refusing to comply, the FBI was now facing a different kind of precedent if an appellate court overturned the order. In many jurisdictions, computer code is recognized as speech under the First Amendment. The Supreme Court has not ruled definitively on this yet, but if computer code is speech, the order amounts to the government compelling someone to speak and dictating what they must say. When a third party came forward and offered to hack into the suspect’s phone, the FBI apparently took them up on the offer. The lawsuit against Apple was dropped, but the issues remain.
Investigating terrorist attacks is undeniably important. But to what extent should we be willing to compromise everyone’s security and privacy to do so? Virtually any security measure can already be bypassed with enough determination. If Apple creates a way to bypass the encryption and security measures in its own products, what happens if the bad guys get access to these tools? A backdoor can be used by anyone with a key. There is no such thing as a “good guys only” backdoor.
It is also ironic that tech companies like Facebook and Google rushed to support Apple, when these companies are the ones that collect enormous amounts of personal data on all of us. For the ordinary citizen, it comes down to a question: do we trust the tech companies (that is, the private sector) more with our data, or do we trust the government more? Since individual privacy is already so thoroughly compromised, and is likely to be further eroded by the Internet of Things, we need stronger computer security to make sure that our no-longer-private information does not fall into the wrong hands. This is yet another reason to consider carefully the implications of compromising everyone’s computer security through a backdoor in an attempt to improve national security.