Reaction: More Encryption is Bad?

This week I was peacefully reading the March 9th issue of ACM Queue when I received a bit of a surprise. It seems someone actually buys into the “blame the victim” game, arguing that governments are going to break all encryption if we don’t give them what they want.

These ideas are all based on the same principle: If we cannot break the crypto for a specific criminal on demand, we will preemptively break it for everybody. And whatever you may feel about politicians, they do have the legitimacy and power to do so. They have the constitutions, legislative powers, courts of law, and police forces to make this happen. The IT and networking communities overlooked a wise saying from soldiers and police officers: “Make sure the other side has an easier way out than destroying you.” But we didn’t, and they are.

If you don’t get the point, it’s simple: the only way to really have secure communications is to give the government the keys. Once again, my inner philosopher threw up (as I recently said on a Network Break podcast). The reason I find the line of argument above so horrifying is simple: it’s just true enough to make you think about it, and maybe even fall in line. In an article from 2013, the same author says that state actors don’t bother trying to break encryption; they just pay people to weaken publicly available encryption mechanisms. I wonder if they also pay people to write in widely read computing magazines that resistance is futile, so we should all just give up now.

The author of both of these pieces makes one naive assumption, however: that the government is a wholly good force which will never, ever, even imagine using the back doors it is so keen to get for anything other than capturing the worst of all possible criminals, terrorists, and the like.

They would never, for instance, turn anything learned through surveillance toward general, wide-scope searches, right? No, no one would ever turn information gathered for national intelligence purposes over to ordinary police forces for the prosecution of other crimes, right? Except when they do, of course.

A while back, we noted a report showing that the “sneak-and-peek” provision of the Patriot Act that was alleged to be used only in national security and terrorism investigations has overwhelmingly been used in narcotics cases. Now the New York Times reports that National Security Agency data will be shared with other intelligence agencies like the FBI without first applying any screens for privacy.

Nor would this sort of information ever be used for tracking the attitudes of children, right?

For example, the FBI says it doesn’t want to limit students’ freedom of speech. Yet the document makes clear that the Bureau essentially expects teachers and administrators to monitor and report on students’ thoughts. It encourages school officials to identify students who “engage in communications indicating support for extremist ideologies” or who are “curious about the subject matter” of extremism. But in the United States, holding beliefs and researching ideas, even extreme ones, are not crimes, and schools should be environments where inquiry — including about controversial topics — thrives.

Sure, we all want to see criminals caught. Sure, we all want to protect people. But there are many sides to security; sometimes breaking security, or allowing anyone with the right “badge” to break it, can be used for immoral ends, too. We can’t forget that people can get hurt by allowing a government, any government, to dictate what can and cannot be encrypted, how strong the encryption is allowed to be, and who is allowed to break the encryption.

This isn’t a one-way street, with angel-robed, halo-wearing people on one side and black-hatted, horn-wearing people on the other. Either way you go, someone is going to get hurt. In the end, the only real question is: will you remain free to choose, or not?

America, and each and every one of its citizens, is safer with strong encryption. The sophistication and number of cyber attacks on the security of individuals, businesses, government agencies, and critical infrastructure are only increasing. Now is not the time to mandate defective products with weakened security. Now is the time to double down on creating the best, most secure systems and devices. This is essential not only for our personal privacy and security, but also for our national security.

4 Comments

  1. Mike on 15 March 2016 at 9:25 pm

    Thanks for your take, Russ.

    Without diving deep into paranoia: if governments have an easy way to break crypto … who is to say one or more of those governments won’t suffer a compromise of some magnitude that serves up those “universal keys” to the general public?

    Strong crypto without back doors is the BEST way to go for the BEST interests of ALL people and ALL nations/countries.
    It also hinders governments from easily(?) committing “unreasonable search and seizure” violations [as outlined in the Fourth Amendment of the United States Constitution].

    Because there have already been mass breaches of government IT infrastructure.
    *cough* The data breach of US government employee records, anyone?

    And the last thing we need to do is figuratively fall back into the days of plain-text (telnet, rsh).



    • Russ on 15 March 2016 at 11:40 pm

      Mike — thanks for stopping by and commenting. I completely agree with your comment… What people who say, “but don’t you want law enforcement to be able to look into the phone of a terrorist,” don’t seem to realize is that if you replace “law enforcement” with “stalker” and “terrorist” with “your daughter,” the problem suddenly becomes much different.

      🙂

      Russ



  2. Matthew Sabin on 17 March 2016 at 12:39 pm

    I’m puzzled that the press and government never seem to mention that even with all “corporate” or “public” encryption – Apple, chat apps, open-source packages, etc. – there’s still quite a bit of “private” encryption. There will never be a back door in that. It’s more a case of making the good guys’ encryption weaker while allowing the bad guys to keep their strong encryption.
    Seems a bit myopic.



    • Russ on 18 March 2016 at 1:42 pm

      This is a point I think Roger Halbheer made recently on his blog — that you can’t really control encryption, because someone, somewhere, is going to write the stuff. I agree on the myopic point — I don’t think people really think this stuff through.

      Thanks for the comment!

      🙂

      Russ