The dispute at hand concerns whether the US government should be able to compel large tech manufacturers to create a backdoor that bypasses a device’s security and exposes its encrypted data. Are there situations where certain parties should be given the power to unlock and view personal phone data? Or is it entirely unethical for anyone at all to have backdoor access to an individual’s phone?
Those who support the FBI’s stance argue that safety is more valuable than privacy. During investigations such as the one into the San Bernardino terrorist attack, law enforcement wants to leave no stone unturned, and Apple’s reluctance to assist with bypassing the device’s security can be perceived as obstruction of the investigation. If Apple has the capability and resources to develop software that can grant access to data specifically for criminal cases like these, it might as well use them rather than let the phone sit with its potentially valuable data still locked away, which would be unfair to the victims and their families. Former FBI Director James Comey, for his part, maintained that he was not seeking a covert backdoor method of surveillance, as the opposition suggested: “We want to use the front door, with clarity and transparency, and with clear guidance provided by law.”
On the other hand, those taking Apple’s side believe that the software the government wants Apple to develop can – and probably will – be used against millions of innocent people. Some suspect the case is merely a vehicle for winning broader approval of encryption backdoors. Apple’s privacy commitment to its customers rules out any backdoor that would make it easier for law enforcement and criminal hackers alike to access the private information stored on people’s phones.
One solution is to leave things as they currently are: companies like Apple are not liable for failing to decrypt data; their only responsibility is to the public – their customers – to whom they promise confidentiality and security. While I sympathize with this perspective, it completely rules out accessing device data in cases where that information could, for example, help identify a suspect’s criminal contacts.
Another way around this issue would be to have companies assist the government in breaking the encryption whenever the need arises. Their assistance would be required, but they would not need to build a backdoor into their software or devices before release. This prevents such a tool from falling into the wrong hands and becoming a threat to millions.
Another possible solution to this ethical problem would be to prohibit companies from releasing any software or device unless they retain the ability to break its encryption, while only permitting them to do so upon receiving a court order. This way the government would have no direct access and could not leverage the capability for surveillance purposes; it would simply have the assurance that, if the need ever arises, the device can be unlocked. What I dislike about this solution is that the mere existence of a backdoor opens the possibility that someone with malicious intent could access it and cause chaos.
I personally think the second solution makes the most sense. Decryption can be carried out more easily with the technical assistance of the distributors of the devices or software, yet no backdoor is guaranteed to exist, which is good: hackers cannot stumble upon a built-in way into the system and exploit it.