Last year, Cybersecurity and Infrastructure Security Agency Director Jen Easterly gave a speech at Carnegie Mellon that emphasized the “secure by default” future that federal agencies are aiming for when they talk about driving security into earlier phases of software design.

What does this look like in practice, and how is this going to affect the federal government’s software procurement policies? Those are the types of questions that federal agencies and their software providers alike are asking as they brace for the new world of “software bills of materials” (aka “SBOMs”) and their disclosure requirements.

If you want a front-row seat for the federal government’s effort to make software “secure by default” over the next two years, memory safety vulnerabilities are a great class of software flaw to watch. Let’s take a look.

What is memory safety?

Memory safety is a particularly ripe domain for software vulnerabilities. So much so that back in November, the National Security Agency issued guidance to “help software developers and operators prevent and mitigate software memory safety issues, which account for a large portion of exploitable vulnerabilities.”

Most low-level programming languages, like C and C++, are memory unsafe: developers have to manually allocate and free the machine’s memory. If you make a mistake, your program can read or write memory it was never meant to touch. And that leads to bugs, crashes, data leaks and all kinds of potential exploits.
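To make the failure mode concrete, here is a minimal sketch of a dangling-pointer bug. It is not from the NSA guidance; it is written in Rust’s `unsafe` subset simply to mirror, in a single language, the unchecked pointer access that C and C++ allow by default.

```rust
fn main() {
    let ptr: *const i32;
    {
        let value = Box::new(42); // heap allocation, freed at the end of this scope
        ptr = &*value as *const i32;
    } // `value` is dropped here, so `ptr` now dangles

    // The equivalent C read compiles and runs silently, yielding garbage,
    // a crash, or an exploitable primitive. Rust only lets it compile
    // because we explicitly opted out of the safety checks:
    let stale = unsafe { *ptr }; // use-after-free: undefined behavior
    println!("{stale}");
}
```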

Dealing with memory safety in these low-level languages gets taught in programming classes, but it’s tedious, error-prone and time-intensive work.

Memory safety vulnerabilities, such as buffer overflows and use-after-free errors, have accounted for the majority of application security issues disclosed by software companies. Back in 2019, Microsoft revealed that 70% of its common vulnerabilities and exposures, or CVEs, were caused by developers making memory corruption mistakes in C and C++ code.
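A buffer overflow follows the same recipe. In this hedged sketch, again using Rust’s `unsafe` escape hatch to reproduce C’s behavior, a read runs past the end of a four-byte buffer; safe, bounds-checked indexing would stop the program instead of touching adjacent memory.

```rust
fn main() {
    let buf = [0u8; 4];
    let i = 7; // out-of-bounds index

    // Safe Rust bounds-checks every access and panics rather than
    // corrupting memory:
    // let oops = buf[i]; // panic: index out of bounds

    // The raw-pointer version compiles and silently reads past the end
    // of the buffer, exactly like unchecked array indexing in C.
    let past_the_end = unsafe { *buf.as_ptr().add(i) }; // undefined behavior
    println!("{past_the_end}");
}
```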

Memory unsafe

According to Consumer Reports’ Future of Memory Safety report, “60 to 70 percent of browser and kernel vulnerabilities—and security bugs found in C/C++ code bases—are due to memory unsafety, many of which can be solved by using memory-safe languages.”

Memory-safe languages have been around for a long time. But until recently, they weren’t performant or scalable enough for low-level programming at the kernel or firmware level. And so a whole class of developers continued to use memory-unsafe languages.

The Rust programming language burst onto the scene by pairing low-level control and performance with memory safety guardrails, and it has radically changed how developers can approach low-level programming.
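As a small illustration of those guardrails, here is the use-after-free pattern from earlier rewritten in safe Rust. The borrow checker refuses to compile it, which is what eliminating a bug class “by default” looks like in practice:

```rust
fn main() {
    let value = String::from("hello");
    let reference = &value;  // borrow `value`
    drop(value);             // try to free it while the borrow is live
    println!("{reference}"); // the later use keeps the borrow alive

    // rustc rejects this program outright:
    // error[E0505]: cannot move out of `value` because it is borrowed
}
```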

The Rust community has already reached 2 million developers, and Rust has become one of the most-loved programming languages. Major industry players, including Google, have completed major rewrites in Rust in the name of memory safety, including much of the critical backend infrastructure (HTTP and TLS) for Android.

But what about all that other legacy code out there in production in the world’s systems?

Rewriting software is hard. It takes a lot of time, and it takes effort to convince management that it’s a worthwhile investment. It’s one thing for publicly funded government agencies and the world’s largest tech companies, with their deep pockets, to rewrite for memory safety. But convincing the rest of the market, and under-resourced open-source maintainers, to do the right thing is a major challenge.

In its Cybersecurity Information Sheet on Software Memory Safety, released late last year, the NSA described the key challenge of driving the “culture of software development towards utilizing memory safe languages.”

The power of SBOMs

The real power of SBOMs is their up-front value to the software procurement process. They provide the “list of ingredients” that software consumers have always lacked when making informed decisions about how a technology will affect their security posture.

When we can verify what’s inside our software before we purchase it, the laws of supply and demand should drive down our overall usage of memory-unsafe software. If you are choosing which software to adopt and have multiple options, you are going to favor the software that wasn’t written in a memory-unsafe language like C or C++, because you want to avoid a weakened security posture.
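To sketch how that procurement logic could be automated, the hypothetical Rust program below scans an SBOM and flags components written in memory-unsafe languages. It assumes a CycloneDX-style JSON document with a per-component `language` field, which is an illustrative simplification; real SBOM formats record this kind of metadata in different places.

```rust
// Hedged sketch, not a real SBOM tool. Requires serde_json = "1" in Cargo.toml.
use serde_json::Value;

// Languages a reviewer might flag; the list itself is illustrative.
const MEMORY_UNSAFE: &[&str] = &["c", "c++", "objective-c", "assembly"];

fn flag_unsafe_components(sbom_json: &str) -> Vec<String> {
    let doc: Value = serde_json::from_str(sbom_json).expect("invalid SBOM JSON");
    let mut flagged = Vec::new();
    if let Some(components) = doc["components"].as_array() {
        for c in components {
            let name = c["name"].as_str().unwrap_or("<unnamed>");
            // `language` is a hypothetical field used for this sketch.
            let lang = c["language"].as_str().unwrap_or("").to_lowercase();
            if MEMORY_UNSAFE.contains(&lang.as_str()) {
                flagged.push(format!("{name} ({lang})"));
            }
        }
    }
    flagged
}

fn main() {
    let sbom = r#"{
        "components": [
            { "name": "libwidget", "language": "C" },
            { "name": "netstack",  "language": "Rust" }
        ]
    }"#;
    for component in flag_unsafe_components(sbom) {
        println!("memory-unsafe component: {component}");
    }
}
```

In a real procurement pipeline, the same check would key off package provenance and build metadata rather than a single self-reported field, but the buying decision it supports is the one described above.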

The White House’s National Cybersecurity Strategy described the federal commitment to SBOMs as central to an overall “software supply chain risk mitigation objective.” It also described efforts to develop an “adaptable safe harbor framework to shield from liability companies that securely develop and maintain their software products and services,” drawing on standards like the NIST Secure Software Development Framework, or SSDF.

The read-between-the-lines message here is that the federal government, the country’s largest purchaser of software, will itself use SBOMs to inform security compliance requirements for procurement, and will then encourage industry with the carrot of legal indemnity for following similar precautions.

As SBOMs give first the government, then industry at large, a standard method to inspect software packages, markets will be able to move much faster to weed out critical software vulnerabilities. Memory safety may be SBOMs’ first killer use case, but it will be equally interesting to watch how SBOMs play a role in the government’s carrot-and-stick measures around legal indemnities and safe harbors.

Dan Lorenc is CEO and co-founder of Chainguard. Previously he was staff software engineer and lead for Google’s Open Source Security Team (GOSST). He has founded projects like Minikube, Skaffold, TektonCD, and Sigstore.
