Intentionally poisoning BGP routes in the Default-Free Zone (DFZ) would always be a bad thing, right? Actually, this is a fairly common method to steer traffic flows away from and through specific autonomous systems. How does this work, how common is it, and who does this? Jared Smith joins us on this episode of the Hedge to discuss the technique, and his research into how frequently it is used.
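
To make the mechanics concrete, here is a minimal sketch in FRR-style configuration of what a poisoned advertisement might look like. All ASNs and the prefix are made-up documentation values, and this is an illustration of the technique rather than a recipe: the operator inserts the ASN of the AS it wants to avoid into its own advertised AS path, so that AS's ordinary loop-prevention check rejects the route.

```
! Hypothetical example: AS 64500 wants inbound traffic for
! 203.0.113.0/24 to avoid transiting AS 64666. It inserts 64666 into
! the AS path it advertises; when AS 64666 receives the route, it
! sees its own ASN in the path and discards it as a loop, so traffic
! toward the prefix is steered around that AS.
router bgp 64500
 neighbor 192.0.2.1 remote-as 64501
 !
 address-family ipv4 unicast
  network 203.0.113.0/24
  neighbor 192.0.2.1 route-map POISON out
 exit-address-family
!
route-map POISON permit 10
 ! Prepending "64666 64500" (rather than 64666 alone) keeps 64500 as
 ! the rightmost (origin) AS, so the advertisement still matches an
 ! RPKI ROA for AS 64500. The exported path becomes: 64500 64666 64500.
 set as-path prepend 64666 64500
```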


Recent research into the text of RFCs versus the security of the protocols described came to this conclusion—

While not conclusive, this suggests that there may be some correlation between the level of ambiguity in RFCs and subsequent implementation security flaws.

This should come as no surprise to network engineers—after all, complexity is the enemy of security. Beyond the novel ways the authors use to understand the shape of the world of RFCs (you should really read the paper; it’s well worth the time), this desire to increase security by decreasing the ambiguity of specifications is fascinating. We often think that writing better specifications requires having better requirements, but down this path lies only despair.

Better requirements are the one thing a network engineer can never really hope for.

It’s not just that networks are often used as a sort of “complexity sink,” the place where every hard problem goes to be solved. It’s also the uncertainty of the environment in which the network must operate. What new application will be stuffed on top of the network this week? Will anyone tell the network folks about this new application, or just open a ticket when it doesn’t work right? What about all the changes developers are making to applications right now, and their impact on the network? There are link failures, software failures, hardware failures, and the mean time between mistakes. There is the pace of innovation (which I tend to think is a bit overblown—rule11, after all—we are often talking about new products rather than new ideas).

What the network is supposed to do—just provide IP transport between two devices—turns out to be hard. It’s hard because “just transporting packets” isn’t ever enough. These packets must be delivered consistently (low jitter, few drops) across an ever-changing landscape.

To this end—

[C]omplexity is most succinctly discussed in terms of functionality and its robustness. Specifically, we argue that complexity in highly organized systems arises primarily from design strategies intended to create robustness to uncertainty in their environments and component parts.

Uncertainty is the key word here. What can we do about all of this?

We can reduce uncertainty. There are three ways to do this. First, you can obfuscate it—this is harmful. Second, you can reduce the scope of the job at hand, throwing some of the uncertainty (and therefore complexity) over the cubicle wall. This can be useful in some situations, but remember that the less work you’re doing, the less value you add. Beware of commoditizing yourself.

Finally, you can manage the uncertainty. This generally means using modularization intelligently to partition off problems into smaller sets. It’s easier to solve a set of well-scoped problems with little uncertainty than to solve one big problem with unknowable uncertainty.

This might all sound great in theory, but how do we do this in real life? Where does the rubber hit the road? This is what Ethan and I tried to show in Problems and Solutions—how to understand the problems that need to be solved, and then how to solve each of those problems within a larger system. This is also what many parts of The Art of Network Architecture are about, and then again what Jeff and I wrote about in Navigating Network Complexity.

I know it often seems like it’s not worth learning the theory; it’s so much easier to focus on the day-to-day, the configuration of this device, or the shiny thing that vendor just created. It’s easier to assume that if I can just hide all the complexity behind intent or automation, I can get my weekends back.

The truth is that we’re paid to solve hard problems, and solving hard problems involves complexity. We can either try to cover that up, or we can learn to manage it.


Attacks on virtual private networks, like those this week targeting a trio of known vulnerabilities in Pulse Secure appliances, have intensified in recent months along with the increase in remote and hybrid work environments since the outbreak of COVID-19.


Over the last couple of years, policy-making institutions have been putting greater focus on the study of various aspects of Artificial Intelligence (AI). This doesn’t come as a surprise.


Forty-six percent of all malware uses the TLS cryptographic protocol to evade detection, communicate with attacker-controlled servers, and exfiltrate data, a new study shows.


One position I think more people should be aware of is a CISO. What does this actually mean – besides being made redundant when a breach is announced? I have personally worked within a CISO-as-a-Service position, but I wanted to get some more insight from those who are working in the trenches daily in an in-house CISO position.


According to Help Net Security, organizations in the pharmaceutical and biotech sectors witnessed a 50% increase in digital attacks between 2019 and 2020. It appears that at least part of those attacks originated from nation-state actors who specifically sought to steal COVID-19 vaccine research.


As further proof of this, new research published today shows that the threat actor carefully planned each stage of the operation to “avoid creating the type of patterns that make tracking them simple,” thus deliberately making forensic analysis difficult.


REvil is an ambitious criminal ransomware-as-a-service (RaaS) enterprise that first came to prominence in April 2019, following the demise of another ransomware gang, GandCrab.


My point is different from whether computer science belongs in engineering, science, or mathematics. Rather, I’m arguing that computing education intersects engineering education, but is not the same as engineering education.


We’re seeing more phishing of one-time passwords. Resisting this is the value proposition of something like U2F, but how do we build phishing-resistant authentication mechanisms more broadly?
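
One widely deployed answer is origin-bound public-key authentication in the browser (U2F’s successor, WebAuthn). A minimal browser-side sketch, with a placeholder rpId and none of the required server-side verification shown, illustrates where the phishing resistance comes from: the browser binds the signed assertion to the page’s origin, so there is no reusable secret for a user to type into the wrong site.

```typescript
// Sketch only: the rpId and server plumbing are assumptions for the
// example, not a drop-in implementation.
async function signIn(challengeFromServer: ArrayBuffer) {
  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge: challengeFromServer, // server-generated, single-use nonce
      // The browser enforces that rpId matches the page's origin, so a
      // look-alike phishing domain cannot request an assertion for it.
      rpId: "example.com",
      userVerification: "preferred",
    },
  });
  // The signed assertion goes back to the server, which verifies the
  // signature against the public key registered at enrollment.
  return assertion;
}
```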


While the interface has changed little over time, Amazon’s Simple Storage Service (S3) is anything but basic on the backend. After more than fifteen years of development, the concept of “storage for the internet,” which sprang from retailer Amazon.com’s own needs, continues to evolve, driven these days by a sharp rise in machine-to-machine interaction.


No question about it. Intel had to get a lot of moving pieces all meshing well to deliver the “Ice Lake” Xeon SP server processors, which came out earlier this month and which have actually been shipping to a few dozen select customers since the end of 2020.


Bug-bounty programs have accelerated in the past few years. Many organizations — bewitched by bounty programs’ promise of faster vulnerability identification, improved product security, and cost-effective outsourcing solutions — find themselves facing unanticipated vulnerabilities and unexpected threats.


In a software supply-chain attack reminiscent of the SolarWinds compromise, unknown attackers used a vulnerable tool published by code checking firm Codecov for a little over two months to collect sensitive development information from the company’s clients.


Uptime Institute Members say one of their most vexing security concerns is the insider threat — authorized staff, vendors or visitors acting with malicious intent.


The concept of “passwordless” authentication has been gaining significant industry and media attention.

One of the big movements in the networking world is disaggregation—splitting the control plane and other applications that make the network “go” away from the hardware and the network operating system. This is, in fact, a movement I’ve been arguing in favor of for many years—and I’m not about to change my perspective now. The arguments for splitting hardware from software, and for componentizing the software itself, are so strong that much of the 5G transition also involves the open RAN, a disaggregated stack for edge radio networks.

If you’ve been following my work for any amount of time, you know what comes next: If you haven’t found the tradeoffs, you haven’t looked hard enough.

This article on hardening Linux (you should go read it, I’ll wait ’til you get back) exposes some of the complexities and tradeoffs involved in disaggregation in the area of security. Some further thoughts on hardening Linux here, as well. Two points.

First, disaggregation has serious advantages, but disaggregation is also hard work. With a commercial implementation you wouldn’t necessarily think about these kinds of supply chain issues. This is an example of the state/optimization/surfaces tradeoff. You can optimize your network more fully using disaggregation techniques, but there are going to be more interaction surfaces, and there’s going to be more state to deal with (for instance, the security state on individual devices).

There are several items on this list that also illustrate the state/optimization/surfaces tradeoff. For instance, eBPF is on the list of things to disable … but eBPF is probably going to be crucial to many future network-facing implementations. Anything that’s useful is going to inherently create attack surfaces you need to deal with. Get over it.
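
The usual middle ground in the hardening guides is to narrow who can use such a surface rather than remove it outright. A minimal sketch, assuming a reasonably recent Linux kernel (the exact knobs and their defaults vary by kernel version and distribution):

```
# Illustrative only: restrict eBPF to privileged processes instead of
# disabling the subsystem, preserving it for tooling that needs it.
sysctl -w kernel.unprivileged_bpf_disabled=1

# Confirm the current setting.
sysctl kernel.unprivileged_bpf_disabled
```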

Second, just because you don’t think about these issues with a commercial implementation does not mean you don’t need to think about these things—it just means these kinds of things are opaque to you. Rather than trying to do the “right thing” yourself, you are outsourcing this work to a vendor. This is often a rational decision, and even might often be the right decision, but it’s a decision. We often “bury” these kinds of decisions in our thinking, not realizing we are making tradeoffs.


The Court ruled that Google did not violate copyright law when it included parts of Oracle’s Java programming code in its Android operating system—ending a decade-long multibillion dollar legal battle.


Today, however, choice is fast becoming an empty mantra as consumers face the iron law of compatibility. Forced to “opt in,” users submit to a relentless schedule of upgrades and updates among the ever-proliferating gadgets and technologies that bring us so much while governing our lives more and more.


I like my colleagues, but I’ve never met them in person. I found my own doctor; I cook my own food. My manager is 26 — too young for me to expect any parental warmth from him. When people ask me how I feel about my new position, I shrug: It’s a job.


Another day, another horrific Facebook privacy scandal. We know what comes next: Facebook will argue that losing a lot of our data shows bad third-party actors are the real problem, and that we should trust Facebook to make more decisions about our data to protect against them.


Cookies are dying, and the tracking industry is scrambling to replace them. Google has proposed Federated Learning of Cohorts (FLoC), TURTLEDOVE, and other bird-themed tech that would have browsers do some of the behavioral profiling that third-party trackers do today.


When we think about environmental problems, we naturally imagine huge smokestacks turning the sky dark and coating the trees with soot. But glitzy high-tech stuff like cloud computing and cryptocurrency uses a lot of energy too.


From fitness trackers to connected cars, IoT (“Internet of Things”) devices have made our lives easier and more convenient. Despite their nifty features, many harbor poor design when it comes to security and privacy of the data they collect (and often transmit) about you and your habits.


At Cisco Live 2021, Cisco announced enhancements to People Insights, a feature in its Webex platform. The enhancements monitor employee behavior in meetings and inter-office collaboration. The goal, according to Cisco, is to “increase and promote personal well-being.”


Several U.S. banks have started deploying camera software that can analyze customer preferences, monitor workers and spot people sleeping near ATMs, even as they remain wary about possible backlash over increased surveillance, more than a dozen banking and technology sources told Reuters.


In some contexts, such as employment, decision making based on arbitrary criteria is legal, and in others such as criminal sentencing, it is not. As algorithms replace human deciders, what are the considerations and consequences for decisions that are made at scale? And what are the moral or ethical implications?


Are tech giants really damned if they do and damned if they don’t (protect our privacy)?


Social media companies don’t fit into the framework of Section 230. Pushing the false dichotomy of “platform” versus “content creator” gives these companies more power. They can hide behind their status as platforms, claiming their filtering is perfectly neutral, undermining free speech all the while.


World-class chess, Go, and Jeopardy-playing programs are impressive, but they prove nothing about whether computers can be made to achieve AGI.


The average software application depends on more than 500 open source libraries and components, up 77% from 298 dependencies in two years, highlighting the difficulty of tracking the vulnerabilities in every software component, according to a new report from software management firm Synopsys.


Ambient computing is a broad term that describes an environment of smart devices, data, A.I. decisions, and human activity that enables computer actions alongside everyday life, without the need for direct human commands or intervention.


The emergence of new protocols such as DNS-over-HTTPS (DoH) has resulted in some browsers changing security-critical behaviour without explaining the implications to users.


In early March 2021, a hacker group publicly exposed the username and password of an administrative account of a security camera vendor. The credentials enabled them to access 150,000 commercial security systems and, potentially, set up subsequent attacks on other critical equipment.


Security automation for posture assessment has been difficult to achieve even though many standards-based and proprietary solutions have been developed. The primary problem is the complexity of solutions requiring customisation by each enterprise.


In this study, we investigate DoT using 3.2k RIPE Atlas home probes deployed across more than 125 countries in July 2019.


The SolarWinds attack, which succeeded by utilizing the Sunburst malware, shocked the cyber-security industry. The attack achieved persistence and evaded detection long enough to gain access to the victim’s source code.


A team of Internet of Things security researchers has discovered vulnerabilities in the way IoT device vendors manage access across multiple clouds and users, putting both individuals and vendors at risk.


IT Home reports that Netac has memory modules in its research and development department, and is currently overclocking them to hit an impossibly fast speed: 10,000MHz.


Cyber attackers are very skilled at infiltration; like burglars, they find ways into a house through cracks and holes the homeowner doesn’t know about. That is what the new cyber attack group dubbed “Hafnium” did when it exploited several zero-day Microsoft Exchange vulnerabilities to get into target networks.


Terraform is a tool that helps you manage various cloud infrastructure services as code. You codify your infrastructure, which is why this approach is known as Infrastructure as Code (IaC).
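
As a minimal sketch of what that looks like in practice (the provider, region, and bucket name below are placeholders, not recommendations): you declare the desired state in Terraform’s configuration language, then terraform plan and terraform apply reconcile the real infrastructure with that description.

```
# Hypothetical example: declare an S3 bucket; Terraform computes and
# applies whatever changes are needed to make reality match.
terraform {
  required_providers {
    aws = {
      source = "hashicorp/aws"
    }
  }
}

provider "aws" {
  region = "us-east-1"
}

resource "aws_s3_bucket" "example" {
  bucket = "my-example-bucket-name"
}
```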


People talk about the cybersecurity job market like it’s a monolith, but there are a number of different roles within cybersecurity, depending not only on your skill level and experience but on what you like to do.


The weather bureau responsible for forecasting across one-tenth of the planet wants a fixed-line connection between Australia and Antarctica, but it has warned that icebergs could be an issue.


It’s reported that if the new Chia cryptocurrency takes off, the PC industry and gamers may have to grapple with a shortage of storage, as inventory of hard disk drives and solid-state drives could quickly become depleted.


If you live in the cybersecurity news cycle, you could be forgiven for thinking that ransomware is the only threat. There is always a report of another victim, a new approach, or a new crew.