
Weekend Reads 062918

The Internet and related digital systems that the United States did so much to create have effectuated and symbolized US military, economic, and cultural power for decades. The question raised by this essay is whether these systems, like the Roman Empire’s roads, will come to be seen as a platform that accelerated US decline. @The Hoover Institution

Article 13 reverses one of the key legal doctrines that allowed the Internet to thrive: the idea that computer networks are not “publishers” and are therefore not liable for the actions or statements of their users. This means that you can sue an individual user for libel or copyright infringement, but not the e-mail service or bulletin board or social media platform on which he did it. This immunity made it possible for computer networks to open a floodgate of content produced by independent individuals, without requiring service providers to serve as editors or moderators. —Robert Tracinski @The Federalist

The U.S. Supreme Court today ruled that the government needs to obtain a court-ordered warrant to gather location data on mobile device users. The decision is a major development for privacy rights, but experts say it may have limited bearing on the selling of real-time customer location data by the wireless carriers to third-party companies. —Krebs on Security

The need for an access model for non-public Whois data has been apparent since GDPR became a major issue before the community well over a year ago. Now is the time to address it seriously, and not with half measures. We urgently need a temporary model for access to non-public Whois data for legitimate uses, while the community undertakes longer-term policy development efforts. —Fabricio Vayra @CircleID

More and more companies, government agencies, educational institutions, and philanthropic organizations are today in the grip of a new phenomenon. I’ve termed it “metric fixation.” The key components of metric fixation are the belief that it is possible–and desirable–to replace professional judgment (acquired through personal experience and talent) with numerical indicators of comparative performance based upon standardized data (metrics); and that the best way to motivate people within these organizations is by attaching rewards and penalties to their measured performance. —Jerry Muller @Fast Company

I wish 5G, with its 490 Mbit/sec. speeds and download latency times of 17 milliseconds, was just around the corner. It’s not. I know, I know. AT&T Mobility, Verizon Wireless, and the pairing of T-Mobile and Sprint are all promising 5G real soon now. They’re … fibbing. —Steven J. Vaughan-Nichols @IT World

Digital collaboration technologies are accelerating productivity in the post-phone-call workplace, but tools like Yammer, Workplace by Facebook, and Slack have their dark side. While these channels can help speed group decision-making, they also serve as an enterprise blind spot for insider threats to do their worst – not to mention being open conduits for spreading negativity and toxic behaviors among the ranks. —Ericka Chickowski @Dark Reading

We have reached a point in the evolution of cyber security where hands-off, behind-the-scenes cyber defense should be the norm. Clearly, the best solution would be to deploy less-vulnerable systems. This is a topic that has received great attention for approximately five decades, but developers continue to resist using tools and techniques that have been shown to be effective, such as code minimization, employing formal development methods, and using type-safe languages. —Josiah Dykstra, Eugene H. Spafford @ACM

Weekend Reads 060818

Migrating to IPv6 will make you ready for the next stage of the Internet. I was the principal network design engineer and member of the project team for deploying native IPv6 for all residential home users at Vodafone New Zealand two years ago. As an outcome of that project, IPv6 has been deployed for about 80% of residential home Internet users now. —Mansour Ganji @APNIC

After being embroiled in controversies over its data sharing practices, it turns out that Facebook had granted inappropriate access to its users’ data to more than 60 device makers, including Amazon, Apple, Microsoft, Blackberry, and Samsung. —Swati Khandelwal @The Hacker News

Over the past few years, researchers have started to choreograph vulnerability announcements to make a big press splash. Clever names — the e-mail vulnerability is called “Efail” — websites, and cute logos are now common. Key reporters are given advance information about the vulnerabilities. Sometimes advance teasers are released. Vendors are now part of this process, trying to announce their patches at the same time the vulnerabilities are announced. —Bruce Schneier

Yes, there are habits highly effective cyber-criminals use to be successful! We can leverage the knowledge of these habits to better prepare, defend, and attribute attacks. —Barry Greene

There’s a newly announced set of issues labeled the “EFAIL encryption flaw” that reduces the security of PGP and S/MIME emails. Some of the issues are about HTML email parsing, others are about the use of CBC encryption. All show how hard it is to engineer secure systems, especially when those systems are composed of many components that had disparate design goals. —Adam Shostack @Dark Reading

Ireland is home to over 1,000 multinational companies and hyperscale providers for some of the biggest brands, including Google, Facebook and Amazon—all of which have a vast data center presence on the island. With many tech companies opting to make Ireland one of their bases, the country boasts a growing talent pool and tech community that makes it a top choice for multinational businesses to invest in. —Tanya Duncan @Data Center Journal

GDPR, the European Union’s new privacy law, is drawing advertising money toward Google’s online-ad services and away from competitors that are straining to show they’re complying with the sweeping regulation. The reason: the Alphabet Inc. ad giant, Google, is gathering individuals’ consent for targeted advertising at far higher rates than many competing online-ad services, early data show. That means the new law, the General Data Protection Regulation, is reinforcing—at least initially—the strength of the biggest online-ad players, led by Google and Facebook Inc. —Nick Kostov @Market Watch

Last Friday ICANN took German registrar EPAG to court in Germany. German courts seem to be pretty fast, so instead of having to wait weeks or months to see how they’d rule, we’ve already got the answer. The German court in Bonn has ruled that EPAG (Tucows) is not obliged to collect extra contacts beyond the domain name registrant. The decision, naturally, is in German, but there is a translation into English that we can use to understand how the court arrived at this decision. —Michele Neylon @CircleID

The Internet is composed of networks that rely on each other to provide global connectivity. Consequently, the reachability of a network depends greatly on the connectivity of other networks, and the understanding of interdependencies between ASes is essential for deployment decisions, routing decisions, and connectivity troubleshooting. Our new tool measures AS dependencies and addresses the following questions… —Romain Fontugne @APNIC

Weekend Reads 060118: GDPR Heavy

“But what’s the harm?” Far too often, this is one of the biggest questions posed in debates about the value of privacy and the costs of violating it in the United States. Just last fall, the Federal Trade Commission conducted a workshop exploring the contours of “informational injury”, in which CDT participated. —Joseph Jerome @CDT

WHOIS is a service that was inherited from the pre-ICANN registries and has never had a formal definition or rationale beyond that’s the way it’s always been. None of the attempts to rationalize WHOIS have gone anywhere, and there was a broad agreement that the processes had been repeatedly derailed by trademark lawyers who want a one-stop source for whom to sue if someone utters their client’s name in vain. —John Levine @CircleID

Email addresses are seemingly simple to eliminate in theory, devilishly difficult in practice, and potentially expensive mistakes under GDPR. Send an unredacted address to the wrong place, and someone in Europe becomes a Euro Millionaire. Whoops. —Neil Schwartzman @CircleID

The General Data Protection Regulation is here, and soon we will see if it ushers in a new era of individual empowerment or raises novel barriers to innovation in technology. Fears of unclear mandates and uneven enforcement have led to the common refrain from company leaders, particularly in the U.S., that innovation will be stymied by draconian regulations and ex ante enforcement will create work without meaningful privacy improvements for individuals. —Nuala O’Connor @CircleID

Microsoft and Google are jointly disclosing a new CPU security vulnerability that’s similar to the Meltdown and Spectre flaws that were revealed earlier this year. Labelled Speculative Store Bypass (variant 4), the latest vulnerability is a similar exploit to Spectre and exploits speculative execution that modern CPUs use. —Tom Warren @The Verge

A newly announced vulnerability in iOS (and, just maybe, Android) could be an avenue for exploitation through misbehaving apps. The vulnerability, named “ZipperDown” by Pangu Lab, is described as a “common programming error” by the researchers — so common, in fact, that the team estimates 15,978 out of 168,951 iOS apps (or nearly 10% of the total) are affected. @Dark Reading

Your mobile phone is giving away your approximate location all day long. This isn’t exactly a secret: It has to share this data with your mobile provider constantly to provide better call quality and to route any emergency 911 calls straight to your location. But now, the major mobile providers in the United States — AT&T, Sprint, T-Mobile and Verizon — are selling this location information to third party companies — in real time — without your consent or a court order, and with apparently zero accountability for how this data will be used, stored, shared or protected. @Krebs on Security

But blockchain technology — the endless link of cryptography-secured records that gave us Bitcoin but whose potential for other uses is limitless — is as controversial as it is conspicuous. Those who believe in the power of blockchain will take their worship to a near-religious level, while those who remain skeptical (or simply confused) by the complicated technology will tell you that it’s all hype. It’s a house of cards destined to fall, they’ll say, or they’ll tell you hackers will soon seize control of the entire system and leave us all penniless and destitute. —Michael Raziel @Dark Reading

Weekend Reads 052518

Without adtech, the EU’s GDPR (General Data Protection Regulation) would never have happened. But the GDPR did happen, and as a result websites all over the world are suddenly posting notices about their changed privacy policies, use of cookies, and opt-in choices for “relevant” or “interest-based” (translation: tracking-based) advertising. Email lists are doing the same kinds of things. @Doc Searls’ Weblog

A newly-uncovered form of DDoS attack takes advantage of a well-known, yet still exploitable, security vulnerability in the Universal Plug and Play (UPnP) networking protocol to allow attackers to bypass common methods for detecting their actions. —Danny Palmer @ZDNet

Today, that’s coming in the form of imperceptible musical signals that can be used to take control of smart devices like Amazon’s Alexa or Apple’s Siri to unlock doors, send money, or any of the other things that we give these wicked machines the authority to do. That’s according to a New York Times report, which says researchers in China and the United States have proven that they’re able to “send hidden commands” to smart devices that are “undetectable to the human ear” simply by playing music. —Sam Barsanti @The A.V. Club

In a paper we recently presented at the Passive and Active Measurement Conference 2018 [PDF 652 KB], we analyzed the certificate ecosystem using CT logs. To perform this analysis we downloaded 600 million certificates from 30 CT logs. This vast certificate set gives us insight into the ecosystem itself and allows us to analyze various certificate characteristics. —Oliver Gasser @APNIC

With cybercrime skyrocketing over the past two decades, companies that do business online — whether retailers, banks, or insurance companies — have devoted increasing resources to improving security and combatting Internet fraud. But sophisticated fraudsters do not limit themselves to the online channel, and many organizations have been slow to adopt effective measures to mitigate the risk of fraud carried out through other channels, such as customer contact centers. In many ways, the phone channel has become the weak link. —Patrick Cox @Dark Reading

Just a few years after Bitcoin emerged, startups began racing to build ASICs for mining the currency. Nearly all of those companies have gone belly-up, however—except Bitmain. The company is estimated to control more than 70 percent of the market for Bitcoin-mining hardware. It also uses its hardware to mine bitcoins for itself. A lot of bitcoins: according to Blockchain.info, Bitmain-affiliated mining pools make up more than 40 percent of the computing power available for Bitcoin mining. —Mike Orcutt @Technology Review

Weekend Reads 051118: New Spectre-class vulnerabilities, scraping data, and no middle ground on encryption

A team of security researchers has reportedly discovered a total of eight new “Spectre-class” vulnerabilities in Intel CPUs, which also affect at least a small number of ARM processors and may impact AMD processor architecture as well. Dubbed Spectre-Next Generation, or Spectre-NG, the partial details of the vulnerabilities were first leaked to journalists at German computer magazine Heise, which claims that Intel has classified four of the new vulnerabilities as “high risk” and the remaining four as “medium.” —Mohit Kumar @Hacker News

As cities get smarter, their appetite and access to information is also increasing. The rise of data-generating technologies has given government agencies unprecedented opportunities to harness useful, real-time information about citizens. But governments often lack dedicated expertise and resources to collect, analyze, and ultimately turn such data into actionable information, and so have turned to private-sector companies and academic researchers to get at this information. —Joseph Jerome @CDT

Despite this renewed rhetoric, most experts continue to agree that exceptional access, no matter how you implement it, weakens security. The terminology might have changed, but the essential question has not: should technology companies be forced to develop a system that inherently harms their users? The answer hasn’t changed either: no. —David Ruiz @EFF

IT-mandated password policies seem like a good idea—after all, what are the chances that an attacker will guess your exact passcode out of the 782 million potential combinations in an eight-character string with at least one upper-case letter, one lower-case letter, two numerals, and one symbol? @opensource.com
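The arithmetic behind that rhetorical question is worth running for yourself. Below is a back-of-the-envelope sketch; the 95-character printable set and the guessing rate are assumptions chosen for illustration, not figures from the quoted piece.

```python
# Back-of-the-envelope: how big is an 8-character password space, and how long
# does exhaustive search take at a given guess rate? The 95-character set
# (printable ASCII) and the 1e10 guesses/second rate are assumptions for
# illustration only.

CHARSET = 95            # printable ASCII characters
LENGTH = 8              # mandated password length
GUESSES_PER_SEC = 1e10  # assumed offline cracking rate (fast GPU rig)

keyspace = CHARSET ** LENGTH
seconds = keyspace / GUESSES_PER_SEC

print(f"keyspace: {keyspace:.3e} candidates")
print(f"exhaustive search: ~{seconds / 86400:.1f} days at {GUESSES_PER_SEC:.0e} guesses/sec")

# Composition rules shrink the space an attacker must search, because any
# candidate violating the policy can be skipped. Length helps far more:
# each extra character multiplies the keyspace by 95.
print(f"9 characters instead: {CHARSET ** 9:.3e} candidates")
```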

This week, Facebook held its yearly F8 Developer Conference. Now, if you haven’t heard, Facebook has been in a bit of hot water recently for its general, well, untrustworthiness. —Mike Melanson @The New Stack

A monster distributed denial-of-service attack (DDoS) against KrebsOnSecurity.com in 2016 knocked this site offline for nearly four days. The attack was executed through a network of hacked “Internet of Things” (IoT) devices such as Internet routers, security cameras and digital video recorders. A new study that tries to measure the direct cost of that one attack for IoT device users whose machines were swept up in the assault found that it may have cost device owners a total of $323,973.75 in excess power and added bandwidth consumption. @Krebs on Security

Last month, Wired published a long article about Ray Ozzie and his supposed new scheme for adding a backdoor in encrypted devices. It’s a weird article. It paints Ozzie’s proposal as something that “attains the impossible” and “satisfies both law enforcement and privacy purists,” when (1) it’s barely a proposal, and (2) it’s essentially the same key escrow scheme we’ve been hearing about for decades. @Schneier on Security

Weekend Reads 032318

Today, we are discussing some of our more complex, heuristic techniques to detect malicious use of this vital protocol and how these detect key components of common real-world attacks. These analytics focus on behavior that is common to a variety of attacks, ranging from advanced targeted intrusions to the more mundane worms, botnets and ransomware. Such techniques are designed to complement more concrete signature-based detection, giving the opportunity to identify such behavior prior to the deployment of analyst driven rules. —John Booth @Azure

List of Third Parties (other than PayPal Customers) with Whom Personal Information May be Shared

There appears to be a huge disconnect here between the EU’s professed concern for keeping Europeans safe — as expressed in the one-hour rule — and the EU’s actual refusal to keep Europeans safe in the offline world. The result is that Europeans, manipulated by an untransparent, unaccountable body, will not be kept safe either online or off. And what if the content in question, as has already occurred, may be trying to warn the public about terrorism? —Judith Bergman @ Gatestone

Ransomware has long been a headache for PC and smartphone users, but in the future, it could be robots that stop working unless a ransom is paid. Researchers at security company IOActive have shown how they managed to hack the humanoid NAO robot made by Softbank and infect one with custom-built ransomware. The researchers said the same attack would work on the Pepper robot too. —Danny Palmer @ZDNet

O3b (other three billion) is an MEO-satellite Internet service provider. Greg Wyler founded the company, and it was subsequently acquired by SES, a major geostationary-orbit (GSO) satellite company. (Wyler moved on to found future LEO Internet service provider OneWeb). —Larry Press @CircleID

Last September, at SNIA’s Storage Developer’s Conference, I presented a prototype of the Project Denali SSD. Project Denali drives provide the flexibility needed to optimize for the workloads of a wide variety of cloud applications, the simplicity to keep pace with rapid innovations in NAND flash memory and application design, and the scale required for multitenant hardware that is so common in the cloud. —Laura Caulfield @Azure

President Donald Trump blocked Singapore-based Broadcom Ltd.’s $117 billion hostile bid for U.S.-based Qualcomm Inc. earlier this week, citing national security concerns after the Committee on Foreign Investment in the U.S. (CFIUS) raised similar questions. This is the first time a U.S. president has used executive powers to block a private company’s acquisition. Why are President Trump and CFIUS so concerned with this particular acquisition? It all boils down to a race against China over 5G technology. —Helen Raleigh @The Federalist

Unless you live in a pineapple under the sea with a talking sponge, you’re probably familiar with the never-ending parade of cute animal pictures sent by text and email—friend to friend, email list to subscriber—and everywhere you look on social media. Hackers are counting on that. —Adam Levin @Marketwatch

Much of the discussion about GDPR focuses on the sticker shock of potential fines, which are only part of the significant changes this regulation introduces for businesses that collect or process personal data. —Tom Bienkowski @Arbor

Modern web-scale data centers are thirsty for bandwidth. Popular applications such as video and virtual reality are increasing in demand, causing data centers to require higher and higher bandwidths — both within data centers and between data centers. In this blog post, we will briefly discuss the current challenges in the optics space as well as some of the key technical aspects of the Voyager’s DWDM transponders. In part two of this series, we will cover why Voyager is a unique, powerful and robust solution. — Dian Patton @Cumulus

It’s a bit of an understatement to say that networking and distributed computing IT infrastructures are undergoing change. It’s fairly easy to point to well-entrenched trends like disaggregation, software-defined networking (SDN), and DevOps and understand that the fundamental ways in which infrastructure is architected, deployed, and operated are evolving faster than ever before. But what is driving all of the change? —Nitin Kumar @Juniper

Weekend Reads 011118: Mostly Security and Policy

Traveling is stressful. The last thing you want to worry about is getting scammed by crooks on the street. Your best tool? Knowledge. Know how they work. Know what they’ll do. Prevent it from happening in the first place. —Relatively Interesting

The European Union’s competition chief is zeroing in on how companies stockpile and use so-called big data, or enormous computer files of customer records, industry statistics and other information. The move diverges starkly from a hands-off approach in the U.S., where regulators emphasize the benefits big data brings to innovation. —Natalia Drozdiak @ MarketWatch

The cybersecurity industry has mushroomed in recent years, but the data breaches just keep coming. Almost every day brings news of a new data breach, with millions of records compromised — including payment details, passwords, and other information that makes those customers vulnerable to theft and identity fraud. —Alistair Johnston @ MarketWatch

To break the dominance of Google on Android, Gael Duval, a former Linux developer and creator of now defunct but once hugely popular Mandrake Linux (later known as Mandriva Linux), has developed an open-source version of Android that is not connected to Google. —Kavita Iyer @ TechWorm

China has rarely undertaken a role in developing public international cybersecurity law over the many years the provisions have existed. Only once did it submit a formal proposal — fifteen years ago to the 2002 Plenipotentiary Conference where it introduced a resolution concerning “rapid Internet growth [that] has given rise to new problems in communication security.” Thus, a China formal submission to the upcoming third EG-ITRs meeting on 17-19 January 2018 in Geneva is significant in itself. —Anthony Rutkowski @ CircleID

If all you want is the TL;DR, here’s the headline finding: due to flaws in both Signal and WhatsApp (which I single out because I use them), it’s theoretically possible for strangers to add themselves to an encrypted group chat. However, the caveat is that these attacks are extremely difficult to pull off in practice, so nobody needs to panic. But both issues are very avoidable, and tend to undermine the logic of having an end-to-end encryption protocol in the first place. —Krebs on Security

This past Friday Twitter issued what is perhaps one of the most remarkable statements in modern diplomatic history: it said both that it would not ban a world leader from its platform and that it reserved the right to delete official statements by heads of state of sovereign nations as it saw fit. Have we truly reached a point in human history where private companies now wield absolute authority over what every government on earth may say to their citizens in the online world that has become the de facto modern town square? —Kalev Leetaru @ Forbes

On differential privacy

Over the past several weeks, there’s been a lot of talk about something called “differential privacy.” What does it mean, how does it work, and… is it really going to be effective? The basic concept is this: the reason people can identify you, personally, from data collected off your phone, searches, web browser configuration, computer configuration, etc., is that you do things just differently enough from other people to create a pattern through cyberspace (or rather, through your data exhaust). Someone looking hard enough can figure out who “you” are by noticing patterns you don’t even think about: you always install the same sorts of software and plugins, you always take the same path to work, you always make the same typing mistakes, etc.

The idea behind differential privacy, considered here by Bruce Schneier, here, and here, is that you can inject noise into the data collection process that doesn’t impact the quality of the data for its intended use, while preventing any particular individual from being identified. If this nut can be cracked, it would be a major boon for online privacy—and this is a nut that deserves some serious cracking.
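To make “injecting noise” concrete, here is a minimal sketch of the classic Laplace mechanism applied to a counting query. The data, the epsilon value, and the commuter example are all made up for illustration; a real deployment would rely on a carefully audited differential privacy library rather than a few lines of hand-rolled Python.

```python
import random

# Minimal sketch of the Laplace mechanism for a counting query.
# A count changes by at most 1 when any single person is added to or removed
# from the data, so its sensitivity is 1. Adding noise drawn from
# Laplace(0, sensitivity / epsilon) hides any one individual's contribution
# while keeping the aggregate count useful.

def laplace_noise(scale: float) -> float:
    """One sample from a zero-mean Laplace distribution (difference of two exponentials)."""
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def private_count(records, predicate, epsilon: float, sensitivity: float = 1.0) -> float:
    """Return a noisy count of the records matching the predicate."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(sensitivity / epsilon)

# Hypothetical data: did each (anonymous) commuter take their usual route today?
took_usual_route = [True] * 923 + [False] * 77
released = private_count(took_usual_route, lambda r: r, epsilon=0.5)
print(f"true count: 923, released count: {released:.1f}")
```

With epsilon set to 0.5, the released count is typically within a handful of the true value, which is plenty for answering “how many commuters take the same route,” yet the noise is large enough that no one can tell whether any single commuter is included in the count.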

But I doubt it can actually be cracked, for several reasons.

First, in context, differential privacy is a form of data abstraction. And anyone who’s paying attention knows that from summary aggregates to protocol layers, abstractions leak. This isn’t a bad thing or a good one; it’s just the way it is. The only way to truly prevent personally identifiable information from leaking through an information-gathering process is to detach the data from the people entirely by making the data random.

Which brings up the second problem: the whole reason for gathering all this data is to be able to predict what you, as a person, are going to do. In fact, the point of big data isn’t just to predict, but to shape and influence. As folks from Google have a habit of asserting, the point is to get to absolute knowledge by making the sample size under study the entire population.

The point of differential privacy is that you can take the information and shape it in such a way as to predict what all females of a certain heritage, of a certain age, and in a certain life position will do in reaction to a specific stimulus, so advertisers can extract more value from these people (and, in the background, the part that no-one wants to talk about, so that other folks can control their attitudes and behaviors so they do “the right thing” more often). If you follow this train of thought, it’s obvious that the more specific you get, the more predictive power and control you’re going to have. There’s not much point in “the flu project” if my doctor can’t predict that I, personally, will catch the flu this year (or not). The closer you can get to that individual prediction, the more power data analytics has.

Why look at everyone when you can focus on a certain gender? Why focus on everyone of a certain gender when you can focus on everyone of a certain gender who has a particular heritage? There doesn’t appear to be any definable point where you can stand athwart the data collection process and say, “beyond this point, no new value is added.” At least no obvious place. The better the collection, the more effective targeting is going to be. As a commenter on Bruce Schneier’s post above says—

The more information you intend to “ask” of your database, the more noise has to be injected in order to minimize the privacy leakage. This means that in DP there is generally a fundamental tradeoff between accuracy and privacy, which can be a big problem when training complex ML models.

We’re running into the state versus optimization pair in the complexity triangle here; there’s no obvious way out of the dilemma.
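The commenter’s tradeoff is easy to see with a few numbers. Under simple sequential composition, a fixed total privacy budget epsilon spread across k queries means each answer gets only epsilon/k of that budget, so the noise in every answer grows with the number of questions asked. A rough sketch, with made-up figures:

```python
import math

# Rough illustration of the accuracy/privacy tradeoff described above.
# With simple sequential composition, a fixed total budget (epsilon) split
# across k Laplace-mechanism queries of sensitivity 1 gives each query a
# noise scale of k / epsilon, so per-answer error grows linearly with k.

TOTAL_EPSILON = 1.0
SENSITIVITY = 1.0

for k in (1, 10, 100, 1000):
    per_query_epsilon = TOTAL_EPSILON / k
    scale = SENSITIVITY / per_query_epsilon      # Laplace scale parameter b
    stddev = math.sqrt(2.0) * scale              # standard deviation of Laplace(0, b)
    print(f"{k:5d} queries -> per-answer noise std dev: {stddev:8.1f}")
```

More sophisticated accounting (advanced composition, for instance) improves on this scaling, but the basic tension remains: the more you ask of the data, the noisier each answer must be to protect the individuals in it.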

Which brings me to the third point: Someone still has to hold the original data to be able to detune it in the process of asking specific questions. The person who holds the data ultimately controls the accuracy of the questions other people ask of it, while allowing themselves more accuracy, and hence a business advantage over their rivals. To some degree—and it might just be my cynicism showing—this type of thing seems like it’s aimed as much at competitors as it is at actually “solving” privacy.

I have great hopes that we can eventually find a way to stand athwart data collection and yell “stop” at the “right moment” at some point in the future. I don’t know if we’ve really figured out what that moment is, nor if we’ve figured out human nature well enough to keep people from sticking their hands in the cookie jar, exploiting leaky abstractions rather than respecting the limits.

It’s an interesting idea, I just don’t know how far it will really go.

Should We Stop Encryption? Can We?

It’s not like they’re asking for a back door for every device.
If the world goes dark through encryption, we’ll be back to the wild west!
After all, if it were your daughter who had been killed in a terrorist attack, you’d want the government to get to that information, too.

While sitting on a panel this last week, I heard all three reactions to the Apple versus FBI case. But none of these reactions ring true to me.

Let’s take the first one: no, they’re not asking for a back door for every device. Under the time-tested balance between privacy and government power, the specific point is that people have a reasonable expectation of privacy until they come under suspicion of wrongdoing. However, it’s very difficult to trust, in the current environment, that such power, once granted, won’t be broadened to every case, all the time. The division between privacy and justice before the law was supposed to be at the point of suspicion. That wall, however, has already been breached, so the argument now moves to “what information should the government be able to trawl through in order to find crimes?” They are asking for the power to break one phone in one situation, but that quickly becomes the power to break every phone all the time on the slimmest of suspicions (or no suspicion at all).

Essentially, hard cases make bad law (which is precisely why specific hard cases are chosen as a battering ram against specific laws).

The second one? Let’s reconsider exactly why it is that the law protects personal action from government snooping without cause. No-one is perfect. Hence, if you dig hard enough, especially in a world where the size of the code of law is measured in the hundreds of thousands of pages, and the Federal tax code is over 70,000 pages long, you will find something someone has done wrong at some point within the last few years.

Putting insane amounts of law together with insane amounts of power to investigate means that anyone can be prosecuted at any time for any reason someone with a uniform might like. Keeping your nose clean, in this situation, doesn’t mean not committing any crimes, as everyone does. Keeping your nose clean, in this situation, means not sticking your neck too far out politically, or making someone with the power to prosecute too angry. We do want to prevent a situation where criminals can run wild, but we don’t want to hand the government—any government—the power to prosecute anyone they like, as that’s just another form of the “wild west” we all say we want to prevent.

By the way, who is going to force every cryptographer in the world to hand over their back doors?

Even if the U.S. government prevails in its quest to compel Apple and other U.S. companies to give the authorities access to encrypted devices or messaging services when they have a warrant, such technology would still be widely available to terrorists and criminals, security analysts say. That’s because so many encrypted products are made by developers working in foreign countries or as part of open source projects, putting them outside the federal government’s reach. For example, instant messaging service Telegram — which offers users encrypted “secret chats” — is headquartered in Germany while encrypted voice call and text-messaging service Silent Phone is based out of Switzerland. And Signal, a popular app for encrypted voice calls and text messaging, is open source. -via the Washington Post

If we’re going to play another round of “the law abiding can be snagged for crimes real criminals can’t be snagged for,” count me out of the game.

The third one? I never trust an argument I can turn around so easily. Let me ask this—would you want breakable encryption on your daughter’s phone if she were being stalked by someone who happens to have a uniform? Oh, but no-one in uniform would do such a thing, because they’d be caught, and held accountable, and…

We tend to forget, all too easily, the reality of being human. As Solzhenitsyn says—

Gradually it was disclosed to me that the line separating good and evil passes not through states, nor between classes, nor between political parties either—but right through every human heart—and through all human hearts. This line shifts. Inside us, it oscillates with the years. And even within hearts overwhelmed by evil, one small bridgehead of good is retained. And even in the best of all hearts, there remains … an unuprooted small corner of evil. -The Gulag Archipelago.

Strong encryption is too important to play games with. As Tom says—

Weakening encryption to enable it to be easily overcome by brute force is asking for a huge Pandora’s box to be opened. Perhaps in the early nineties it was unthinkable for someone to be able to command enough compute resources to overcome large number theory. Today it’s not unheard of to have control over resources vast enough to reverse engineer simple problems in a matter of hours or days instead of weeks or years. Every time a new vulnerability comes out that uses vast computing power to break theory it weakens us all. -via Networking Nerd
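The compute argument is easy to quantify. Here is a quick sketch comparing exhaustive key search times for deliberately shortened keys against a modern key length; the guess rate is an assumed figure, picked only to make the orders of magnitude visible.

```python
# How long does exhaustive key search take as key length shrinks?
# The 1e12 keys/second rate is an assumption standing in for a large,
# well-funded cracking effort; the point is the exponential gap between
# key lengths, not the exact figure.

KEYS_PER_SEC = 1e12
SECONDS_PER_YEAR = 3.156e7

for bits in (40, 56, 80, 128):
    keyspace = 2 ** bits
    years = keyspace / KEYS_PER_SEC / SECONDS_PER_YEAR
    print(f"{bits:3d}-bit key: ~{years:.3e} years to exhaust at 1e12 keys/sec")
```

A 40-bit key (the old export-grade limit) falls in about a second at that rate; a 128-bit key does not fall on any timescale that matters. Any mandated weakening lands somewhere on that slope, and the slope only moves in the attacker’s favor over time.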

Anonymity isn’t a bug

Despite the bad rap it sometimes gets, anonymity – and anonymity technology – is used all the time by everyday people. Think about it: just walking in a park without being recorded or observed or “going off the grid” are common examples of people seeking to disconnect their identity from their activities. via the Center for Democracy and Technology

The problem with anonymity and the modern Internet is we tend to think of being anonymous as either “on” or “off” all the time. The only real reason we can think of to want to be anonymous is to do something evil, to hurt someone, to steal something, or to do something else considered anti-social or wrong.

But there’s a problem with this thinking: it’s much like pitting “the rich” against “the poor,” or any other time-bound classification. There are times when I want to be anonymous, and there are times when I don’t care. It’s not a matter of doing something nefarious. It’s more about expressing opinions you know people won’t agree with, but whose expression could cause you material harm, or about being able to investigate something without telling anyone about the situation.

For instance, suppose someone you love has a dread disease: is it right to violate their privacy by searching for information about the disease on the web? And yet how can you hope to prevent anyone with access to the data about your browsing and your network of friends from drawing a conclusion from the actions you’ve taken? In some places (college campuses in the US, for instance), holding certain opinions or beliefs (conservative Christianity in general, say) can kill your career. Should people not be able to express their opinions in a way that protects them from the harm of the “Twitter storm”? Or what if you move into a house only to find it’s horribly built? If you tell anyone in a way that allows you to be identified, you’ve just lost the value of the house. On the other hand, if you don’t tell anyone at all, you’re letting the builder off the hook.

While privacy can certainly be used to cover a multitude of crimes, it is also necessary to being fully human in any way that really counts.

Information wants to be protected: Security as a mindset

I was teaching a class last week and mentioned something about privacy to the students. One of them shot back, “you’re paranoid.” And at a meeting with some folks about missionaries, and how best to protect them when trouble comes to their door, I was again declared paranoid. In fact, I’ve been told I’m paranoid after presentations by complete strangers who were sitting in the audience.

Okay, so I’m paranoid. I admit it.

But what is there to be paranoid about? We’ve supposedly gotten to the point where no-one cares about privacy, where encryption is pointless because everyone can see everything anyway, and all the rest. Everyone except me, that is—I’ve not “gotten over it,” nor do I think I ever will. In fact, I don’t think any engineer should “get over it,” in terms of privacy and security. Even if you think it’s not a big deal in your own life, engineers should learn to treat other people’s information with the utmost care.

In moving from the person to the digital representation of the person, we often forget it’s someone’s life we’re actually playing with. I think it’s time for engineers to take security—and privacy—personally. It’s time to actually do what we say we do, and make security a part of the design from day one, rather than something tacked on to the end.

And I don’t care if you think I’m paranoid.

Maybe it’s time to retire the old saying “information wants to be free” and replace it with something a little more realistic, like:

Information wants to be protected.

It’s true that there are many different kinds of information. For instance, there’s the information contained in a song, or the information contained in a book, or a blog, or information about someone’s browsing history. Each piece of information has a specific intent, or purpose, a goal for which it was created. Engineers should make their default design such that information is only used for its intended purpose by the creator (or owner) of that information. We should design this into our networks, into our applications, and into our thought patterns. It’s all too easy to think, “we’ll get to security once things are done, and there’s real data being pushed into the system.” And then it’s too easy to think, “no-one has complained, and the world didn’t fall apart, so I’ll do it later.”

But what does it mean to design security into the system from day one? This is often, actually, the hard part. There are tradeoffs, particularly costs, involved with security. These costs might be in terms of complexity, which makes our jobs harder, or in terms of actual costs to bring the system up in the first place.

But if we don’t start pushing back, who will? The users? Most of them don’t even begin to understand the threat. The business folks who pay for the networks and applications we build? Not until they’re convinced there’s an ROI they can get their minds around. Who’s going to need to build that ROI? We are.

A good place to start might be here.

And we’re not going to get there until we all start nurturing the little security geek inside every engineer, until we start taking security (and privacy) a little more seriously. Until we stop thinking about this stuff as just bits on the wire, and start thinking about it as people’s lives. Until we reset our default to “just a little paranoid,” perhaps.


P.S. I’m not so certain we should get over it. Somehow I think we’re losing something of ourselves in this process of opening our lives to anyone and everyone, and I fear that by the time we figure out what it is we’re losing, it’ll be too late to reverse the process. Somehow I think that treating other people as a product (if the service is free, you are the product) is just wrong in ways we’ve not yet been able to define.