In December 2015, a man named Syed Farook and his wife killed fourteen people and injured twenty-one others at a gathering of his co-workers. Incidents like these rarely carry significance in tech circles, except that the iPhone in Farook's possession before he was shot by police became the subject of a standoff between Apple and the FBI. The dispute has made headlines because of its implications for data privacy, and it sparked a fiery debate once Apple refused to provide the bureau a way to gain access to the man's phone.
On 16 February 2016, Tim Cook, Apple’s CEO, wrote an open letter to customers, stating that:
opposing [the order by the FBI] is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.
Apple has taken a clear stance against cooperating with the authorities, striking the match that ignited the entire debate over whether U.S.-born criminals and terrorists are entitled to the same digital privacy protections as any other citizen.
iOS has a built-in protection feature that automatically wipes a phone's data after ten failed attempts to enter its PIN. Since Syed Farook's phone is protected by a PIN, the FBI risks losing the phone's data, the very thing it needs for its investigation, unless it can somehow break in through a back door that circumvents the security in place. Apple's unwillingness to cooperate leaves the investigation at a standstill in this respect.
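To see why that ten-attempt limit is such an obstacle, a little arithmetic helps. The sketch below assumes a 4-digit numeric PIN (the article doesn't state the actual passcode length, so the figures are illustrative only):

```python
# Illustrative sketch: the odds of brute-forcing a PIN before the
# auto-wipe triggers. Assumes a 4-digit numeric PIN, which is an
# assumption -- the actual passcode length is not public.

PIN_SPACE = 10 ** 4   # 10,000 possible 4-digit PINs
MAX_ATTEMPTS = 10     # iOS wipes the device after this many failures

# Probability of hitting the right PIN within the attempt budget,
# guessing distinct PINs uniformly at random:
p_success = MAX_ATTEMPTS / PIN_SPACE
print(f"Chance of success before wipe: {p_success:.2%}")  # 0.10%
```

In other words, under these assumptions an attacker gets roughly a one-in-a-thousand chance before the data self-destructs, which is exactly why the FBI wants the limit removed rather than simply guessing.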
Needless to say, Apple's move has drawn support from other tech giants such as Google, Facebook, and Twitter.
Why This Is So Complicated
On one hand, Apple's stance represents staunch support for the security and privacy of everyday users of its products. It's natural for a company to resist intentionally weakening the security of its products, both because doing so erodes its standing among users and because it prides itself on the security it provides. Weakening a product is a strike against the collective ego of its maker.
On the other hand, authorities and public figures have spoken out against Apple, arguing that such protection shouldn't be afforded to terrorists. Others are conflicted by the issue's complexity, saying that forcing companies to weaken their security sets a dangerous precedent and hands hackers an advantage over their victims once they discover how to exploit back doors. Given the publicity surrounding this case, if Apple were ordered to include such a back door, it would only be a matter of time before someone other than the authorities started using it.
What’s The Solution?
The moment a tech company acknowledges a government's right to demand that its devices include a back door for circumventing their security, we will have reached a point in history where data protection as a whole can no longer be trusted. Are we sure we want to reach that pivotal moment? Would the word "encryption" mean anything anymore? Tell us your thoughts in a comment below!