This is the government's strongest stance yet on software security, which puts manufacturers on notice: fix dangerous coding practices or risk being labeled as negligent.
The problem I'm aware of is more that the number of programmers who know COBOL is vanishingly small... COBOL doesn't really seem to be taught anymore...
...so if something goes wrong at that level, you may be SOL if you can't find one of the increasingly rare programmers who know COBOL well.
The development of new product lines for use in service of critical infrastructure or national critical functions (NCFs) in a memory-unsafe language (e.g., C or C++) where there are readily available alternative memory-safe languages that could be used is dangerous and significantly elevates risk to national security, national economic security, and national public health and safety.
“Putting all new code aside, fortunately, neither this document nor the U.S. government is calling for an immediate migration from C/C++ to Rust — as but one example,” he said. “CISA’s Secure by Design document recognizes that software maintainers simply cannot migrate their code bases en masse like that.”
Companies have until January 1, 2026, to create memory safety roadmaps.
All they are asking for by that date is a roadmap for dealing with memory safety issues, not a rewrite of everything.
Don't assume too much from the headline, folks. They're not saying everything has to be rewritten by 2026. They're saying new product lines serving critical infrastructure should be written in memory-safe languages, and existing ones should have a memory safety roadmap.
If you're about to post about how you think that's unreasonable, I think you should explain why.
It’s one backed by a lot of data. One example is from the Android project.
The percentage of vulnerabilities caused by memory safety issues continues to correlate closely with the development language used for new code. Memory safety issues accounted for 76% of Android vulnerabilities in 2019; in 2024 they are down to 24%, well below the 70% industry norm, and the share continues to drop.
There’s an argument that critical infrastructure software vendors are already meeting standards for basic, non-memory-related items. Yes, there are other categories, but memory safety is one that’s harder to verify. Moving to memory-safe languages is a way to ensure an entire category of correctness, provided unsafe escape hatches aren't used.
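To illustrate that caveat, here's a minimal Rust sketch (hypothetical, not from the article) of how an `unsafe` escape hatch reintroduces exactly the hazard the language otherwise rules out:

```rust
fn main() {
    let ptr: *const String;
    {
        let owner = String::from("widget");
        ptr = &owner; // taking a raw pointer: the borrow checker stops tracking it
    } // `owner` is dropped here, freeing the string

    // Safe Rust won't let you dereference a raw pointer at all;
    // `unsafe` opts out of that check, and the use-after-free comes back.
    unsafe {
        println!("{}", *ptr);
    }
}
```

This compiles without complaint, which is why a roadmap that permits `unsafe` still needs a story for auditing those blocks.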
Using smart pointers doesn’t eliminate the memory safety issue; it merely addresses one aspect of it. Even with smart pointers, nothing prevents you from passing references around and using them after they’re freed.
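A minimal C++ sketch of that failure mode (illustrative, names are made up): `std::unique_ptr` frees the allocation correctly, but a reference taken earlier still dangles:

```cpp
#include <iostream>
#include <memory>
#include <string>

int main() {
    auto owner = std::make_unique<std::string>("widget");
    const std::string& alias = *owner;  // non-owning reference into the allocation

    owner.reset();  // the smart pointer frees the string, exactly as designed

    std::cout << alias << '\n';  // use-after-free: alias now dangles (undefined behavior)
}
```

The ownership side is handled perfectly here; it's the aliasing side that smart pointers don't track.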
I get what you're saying, but I think the issue with optional memory safety features is that it's hard to be sure you're using them in all the places that need them, and hard to keep that true when someone can add a new allocation later. It's certainly doable, and maybe some static analysis tools out there can prove it's all okay.
Whereas Rust is built from the ground up to prove exactly that, plus other things like no memory being shared between threads by accident. Rust makes doing the wrong thing difficult and obvious, rather than the default.
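For contrast, here's roughly the same shape as the C++ example above, in Rust (again illustrative, not from the article); the borrow checker rejects it at compile time instead of letting it become a runtime bug:

```rust
fn main() {
    let owner = String::from("widget");
    let alias = &owner; // non-owning borrow of the allocation

    drop(owner); // error[E0505]: cannot move out of `owner` because it is borrowed

    println!("{alias}"); // the borrow is still live here, so the drop above is rejected
}
```

This won't compile: because `alias` is used after the `drop`, the compiler refuses to let `owner` be freed while the borrow is outstanding.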
Software manufacturers should build products in a manner that systematically prevents the introduction of memory safety vulnerabilities, such as by using a memory safe language or hardware capabilities that prevent memory safety vulnerabilities. Additionally, software manufacturers should publish a memory safety roadmap by January 1, 2026.
My interpretation is that smart pointers are allowed, as long as their use is systematically enforced. Switching to a memory-safe language is just one example.
Just from reading the article, is the scope limited to critical infrastructure software? What does that encompass, exactly? Banking and military software seem like easy assumptions, but what about embedded medical device software? Or embedded software in general?