As Apple and the FBI battle over access to San Bernardino terrorist Syed Farook’s locked iPhone, it is worth remembering that both sides want to keep America — and Americans — safe. The problem is that two security priorities are clashing: the need for strong data protection and cybersecurity vs. the need to investigate and thwart terrorists.
Here is some critical background. Farook’s iPhone 5C has strong encryption — so strong that the FBI can’t crack it.
The only way to get into this phone is to guess the passcode. However, Apple protects the passcode with several features, the most important being “auto-erase”: enter an incorrect passcode 10 times in a row, and the phone wipes itself clean.
It’s a great protection for regular consumers. But, in this case, it serves as a shield for terrorists. The FBI wants Apple to disable the auto-erase feature and invent new software — that does not currently exist — to disable other protections so its agents can crack the passcode.
The FBI wants to search the iPhone to learn more about Farook’s terrorist plans and possible connections to other terrorists. The cell phone could contain absolutely nothing of value to the investigation. On the other hand, it might hold information about another plot or plotters. From a law enforcement perspective, that’s a compelling reason to rummage through the iPhone.
But a legitimate law enforcement and national security rationale for accessing the phone is not the only consideration in this case. There are unintended consequences and concerns that Apple raises.
Suppose, for example, that Apple did develop software to help the FBI break into the phone. Other countries would immediately ask for the same tool. Currently, Apple can legitimately say no, because the tool does not exist. But once it was created, other countries would come knocking at Apple’s door, and some of those nations would not use this software tool for good.
Moreover, giving the FBI what it wants could start an encryption arms race. Apple and other manufacturers could respond by making it even harder to break into their products in the future.
Alternatively, criminals and terrorists could increase their use of third-party applications like Telegram to encrypt their communications. Many of these applications are developed in foreign countries. No FBI request or U.S. court order will persuade these app makers to turn over the keys to their products. In the long run, an encryption arms race might leave the FBI with even less access to information than it has now.
Perhaps most worryingly, if Apple creates this tool, it will introduce a dangerous vulnerability into millions of devices. Once that precedent is set, it will be only a matter of time before the FBI needs to unlock a newer iPhone version or another company’s device. As unlocking tools proliferate, the chances rise that one of them will fall into the wrong hands. That would give hackers or thieves a skeleton key to millions of devices. All of the important personal information you keep on your phone would be put at risk.
There’s a concerning legal question, too. What are the limiting principles to such government power? If the government can order Apple to develop software to crack its own encryption, what’s to stop it from ordering Apple to turn on the audio-recording function of a suspected criminal’s iPhone, or the video camera on a suspected terrorist’s iMac? Where does the government’s power to compel end?
Ultimately, this debate is over what we, as a society, want to prioritize. Do we think that the valuable information — if any — locked in this iPhone is sufficient to outweigh the foreseeable side effects: increased vulnerability of our personal communications devices, misuse by other countries, and an encryption arms race? Many reasonable people can conclude that the FBI’s request to break down device security is too costly.
If there is a way to both protect the security of our devices AND provide the FBI with the info it needs on terrorists, Congress should embrace that option. But right now, most encryption experts seem to think such a solution is impossible.
In this case of competing security demands, we should not require Apple or others to bypass the security they have built into their devices. We all have too much to lose.
- David Inserra is an analyst specializing in homeland- and cyber-security issues in The Heritage Foundation’s Allison Center for Foreign and National Security Policy.
- This piece originally appeared in Tribune News Service.