Among Apple’s many technological achievements, the company’s security and encryption software ranks near the top. Each new version of the iPhone features strong encryption, allowing users to keep their information secure from potential hackers.
The FBI, in order to continue its investigation of the San Bernardino terrorist attacks, wants Apple to create new software to bypass that security and unlock the iPhone of one of the terrorists. Although the software could potentially help the FBI track down whom the killer was in contact with, thereby helping to combat terrorism, creating it would be more hazardous than productive.
A critical reason the situation has escalated is that the FBI handled the initial investigation poorly. A San Bernardino County employee, instructed by the FBI, reset the killer’s iCloud password, preventing the phone from backing up its data to the cloud, as Tim Cook stated in an interview with ABC. Had the password been left untouched, the phone would have automatically backed up its data to iCloud, giving Apple a way to access the data without compromising its encryption software.
Cook also noted that he offered investigators instructions for recovering the data, but the advice was not used.
So, in essence, the FBI proceeded without caution and then used the 1789 All Writs Act to force Apple to bail it out. But without iCloud records, most of the major information on the phone is already lost, which means creating such hazardous software could be in vain.
I understand that for the FBI to complete a thorough investigation it is necessary to leave no stone unturned, but the possible consequences in this instance far outweigh the potential benefits.
Supporters of the FBI claim that, once created, the software would be used only in this specific case. Such a viewpoint ignores the complexity of cybersecurity, because once the software exists, hackers will undoubtedly attempt to access it.
This raises a key issue: following the June 2015 breach of the U.S. Office of Personnel Management’s system, in which four million government employee records were stolen, the FBI released a statement saying, “we take all potential threats to public and private sector systems seriously.”
If that is true, then the bureau should drop the case with Apple. The creation of the software poses a threat to the information of Apple users by making them more vulnerable to data theft. Because iPhones are mobile and interconnected with both public and private systems, requesting the software would contradict the bureau’s previously released statement.
Proceeding with the lawsuit would demonstrate that the FBI does not live up to its own values.
As Tim Cook stated, “this case is bad for America,” because not only could it lead to the creation of software that would allow access to the data of any iPhone, but it would also set a precedent that the government can exert influence over what the private sector creates. In essence, weakening the security systems Apple worked so hard to create would mean weakening American ideals.
Michael Agnello is a Collegian columnist and can be reached at [email protected].
David Hunt 1990 • Mar 1, 2016 at 10:48 am
I’m torn on this subject. Certainly I am not a fan of weak encryption in general.
Specific to this case, the phone was not owned by the terrorist; it was owned by his employer, which, as I understand it, has acceded to the idea of “breaking” the encryption. Where’s the problem?
But here’s a hypothetical for you to consider. Another, even more deadly, attack happens, and, through other sources, it’s learned that there was in fact information on this phone that could have disrupted that new attack.
What would you think then? And what if it were a relative, friend or loved one who was among the victims of an attack that, in hindsight, could have been prevented?