Smartphone with a lock

Apple is fundamentally opposed to creating a backdoor that would allow US authorities to bypass smartphone encryption. However, leaving it to companies alone to decide whether backdoors are acceptable will not improve our data security. We need open encryption mechanisms which users can also modify if in doubt.

A showdown between Apple and the US government has been raging in the US for some time now. In the aftermath of the San Bernardino attack, the FBI wants to investigate the perpetrator’s possible terrorist links. However, the memory of his cell phone, an iPhone 5C, is password-protected. Brute-force attacks are of no use here: after several failed attempts, the smartphone is locked for a period of time. There is even the risk that the perpetrator enabled an option that erases the entire memory after ten failed password attempts.
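To make that mechanism concrete, here is a minimal, purely illustrative sketch of such a retry policy in Python. It is not Apple’s actual implementation; the delay values, the attempt limit handling, and the `wipe_device()` helper are hypothetical and only mirror the behaviour described above (escalating delays, optional wipe after ten failures).

```python
import time

# Hypothetical lockout schedule: after the Nth failed attempt, wait this many
# seconds before the next try is accepted. Values are illustrative only.
LOCKOUT_DELAYS = {5: 60, 6: 300, 7: 3600, 8: 3600, 9: 3600}
MAX_ATTEMPTS = 10  # after this many failures the device may wipe itself


def wipe_device() -> None:
    # Hypothetical helper: in a real device this would destroy the data
    # encryption key, rendering the stored data unrecoverable.
    print("Device wiped: encryption key destroyed, data unrecoverable.")


def check_passcode(entered: str, correct: str, failed_attempts: int,
                   erase_after_max: bool) -> tuple[bool, int]:
    """Return (unlocked, updated_failure_count)."""
    if entered == correct:
        return True, 0
    failed_attempts += 1
    if erase_after_max and failed_attempts >= MAX_ATTEMPTS:
        wipe_device()
        return False, failed_attempts
    # Forced waiting period makes brute-force guessing impractically slow.
    time.sleep(LOCKOUT_DELAYS.get(failed_attempts, 0))
    return False, failed_attempts
```

Even with only a handful of possible delays, the forced waiting periods mean that trying all combinations of a short numeric passcode would take far too long, which is exactly why the FBI wants the mechanism disabled rather than attacked head-on.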

Apple has already released a statement saying that it is cooperating with investigators. However, even following a clear court order, the company is refusing to build a function for the FBI which would make it possible to bypass the encryption and export the memory directly. Apple CEO Tim Cook was up in arms about the court’s decision in an open letter to Apple customers: “They have asked us to build a backdoor to the iPhone.” The software demanded by the FBI could “potentially unlock every iPhone” and fall into the wrong hands – in principle, a reference to the US government as well. After all, Mr. Cook warned that once the trick has been revealed, government authorities could exploit it to “collect your messages, health data, and financial information, locate you, or even activate your iPhone’s camera or microphone without being detected.”

So is Apple nipping things in the bud? Michael Hayden, former Director of the National Security Agency (NSA), is already convinced of it. He accused FBI Director James Comey: “Jim would like a backdoor available to American law enforcement in all devices globally.” And Apple has already achieved something with the extensive public debate: in stark contrast to its otherwise notorious secretiveness about technical matters, Apple is now taking a stand as a company with a single priority: safeguarding the privacy of its customers.

However, there is one counterargument to the Apple spectacle: while backdoors do pose an enormous risk, no company should be allowed to put its own authority before that of a democratic state. If anyone, it should be the state, not an arbitrary company, that decides who can access which data. And the state will – as the initial court ruling suggests – ultimately manage to gain access to the data it wants. As such, the amusement with which many people in this country are following the confrontation between Apple and the US government is ill-founded.

It is not enough to leave the protection of individuals’ privacy to companies once governments have shown that they regard private information as suspicious. Then it is only a matter of time before the backdoors follow. Effective encryption is too important to be left to managers whose authority does not even rest on a democratic vote. Effective protection against snooping and industrial espionage is only possible if the encryption software’s code is made public and can be modified by users if in doubt. That makes building backdoors much more difficult. And people who truly require security and privacy can replace encryption mechanisms with ones they trust more, as the sketch below illustrates.
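What a user-replaceable encryption mechanism could look like in practice is sketched below, assuming the open-source Python `cryptography` package. The `FernetBackend` and `AesGcmBackend` classes are hypothetical examples, not a reference to any specific product: because the code is open, anyone can audit it or swap in a backend they trust more.

```python
# Sketch of a pluggable encryption layer (assumes the open-source
# "cryptography" package: pip install cryptography).
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.ciphers.aead import AESGCM


class FernetBackend:
    """Default backend: authenticated symmetric encryption via Fernet."""
    def __init__(self) -> None:
        self._f = Fernet(Fernet.generate_key())

    def encrypt(self, data: bytes) -> bytes:
        return self._f.encrypt(data)

    def decrypt(self, token: bytes) -> bytes:
        return self._f.decrypt(token)


class AesGcmBackend:
    """Alternative backend a distrustful user could swap in: AES-256-GCM."""
    def __init__(self) -> None:
        self._key = AESGCM.generate_key(bit_length=256)

    def encrypt(self, data: bytes) -> bytes:
        nonce = os.urandom(12)  # fresh nonce per message
        return nonce + AESGCM(self._key).encrypt(nonce, data, None)

    def decrypt(self, blob: bytes) -> bytes:
        nonce, ciphertext = blob[:12], blob[12:]
        return AESGCM(self._key).decrypt(nonce, ciphertext, None)


# The application only talks to the common interface, so the mechanism
# itself is replaceable by the user:
backend = AesGcmBackend()  # or FernetBackend()
secret = backend.encrypt(b"private message")
assert backend.decrypt(secret) == b"private message"
```

Because both backends expose the same two methods, swapping one for the other requires no change to the rest of the program – which is exactly the property the article argues for: open code that users can inspect, modify, and replace.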


Peter H. Ganten is the founder and CEO of Univention.

What's your opinion? Leave a comment!

Comments

  1. The state also has no right to decide who has access to your data; leaving all this to the government is a very dangerous business!

    Reply
  2. 100% agreed. But I do believe that a court should have the possibility to order someone to make specific information available, in the same way a court can allow the police to search a house. This of course only works if the information is available at all. Hence we need open-source encryption implementations with the possibility for the person who wants to encrypt something to change the algorithms.

    Reply
  3. I agree with Peter. If the implementation is open source, consumers will have multiple choices, and other companies will also be able to offer other kinds of encryption. Someone who is not storing sensitive data on their phone is not concerned about security; someone who is will be informed from a security point of view and will apply the required or recommended encryption method. Basically, my impression is that Apple does not want to be part of open source in any way.

    Reply
  4. As a non-American

    I agree on the point that encryption software should be open source (and operating systems should be, too). Apple is mostly against open source; still, on this particular issue I support Apple. As an inhabitant of another country, I would be concerned if another country had a backdoor into my devices. If this precedent-setting case passes through the higher courts in the US, then Android or even other software developed in the US could be the next target. It is an international issue, and it is important that the highest court decides on it. I am not a friend of Apple, due to its proprietary policy, but I support it on this issue.

    Reply
  5. Sure, but if it’s not open, we have to believe Apple or whoever that they keep the content away from the government; we can’t verify it. Trusted IT, on the other hand, requires that it can be verified and changed independently and by the user.

    Reply
