In July 2024, a memory safety error in CrowdStrike’s software brought hospitals, airlines, and companies around the world to a grinding halt. The same frailties exist in almost all medical and diagnostic devices. TTP’s Tim Guite on how the problem of memory safety is changing regulated software.

Software, famously, is eating the world. Venture capitalist Marc Andreessen’s investment thesis that software will take over the value chains of entire industries continues to be proven right every day.

Medical devices are no exception. They increasingly rely on software to provide functionality and flexibility that would not be possible otherwise. But despite the appearance of software as a cutting-edge phenomenon, key tools at the heart of regulated software were originally developed in the 1970s and 1980s, and they have fatal flaws that continue to pose massive risks.

Memory safety issues are one of the leading causes of defects in software of all kinds and are often the key that a bad actor needs to gain control of the system. By now the problem has gained enough attention for cybersecurity agencies in the U.S.A., U.K., Australia, Canada, and New Zealand to urge organizations to “act immediately to reduce, and eventually eliminate, memory safety vulnerabilities from their products.”

Medical device manufacturers who start using memory safe languages now will provide more secure, reliable products to their customers, reduce the risk of costly security breaches, and be able to focus on what differentiates their devices.

What is “Memory Safety”?

Consider software running on a diagnostics instrument. It needs to keep track of lots of information, such as the current time, probe temperature, and patient ID.

Each piece of information is kept in memory at a specific address. Imagine each piece as a book on a library shelf. A “memory safe” system ensures that reading or writing in one of the books does not allow reading or writing in any of the other books.

When a system does not ensure this, the consequences range from incorrect colors on the screen to leaked confidential information and an opening for hackers to take control.
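To make the library analogy concrete, here is a minimal Rust sketch (the data and variable names are invented for illustration). Each reading occupies its own slot in memory, and an attempt to read outside the valid range is rejected rather than silently returning whatever happens to be stored next to it:

    fn main() {
        // Three "books on the shelf": recent probe temperature readings.
        let probe_temperatures = [36.9_f32, 37.1, 37.0];

        // Safe access: asking for a slot that does not exist yields None,
        // instead of reading adjacent, unrelated memory.
        match probe_temperatures.get(5) {
            Some(value) => println!("reading: {value}"),
            None => println!("no such reading"),
        }

        // Direct indexing is checked too: probe_temperatures[5] is rejected
        // at compile time here (or stops the program with a clear error for
        // a computed index) rather than quietly reading another variable.
    }

In a memory unsafe language, the equivalent out-of-bounds read would simply return the contents of a neighboring “book”, with no indication that anything had gone wrong.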

Information is stored in a computer like books in a library.

A 2019 report from Microsoft estimated that 70% of the critical security vulnerabilities in its software arose from problems with memory safety.

In July 2024, a memory safety error in CrowdStrike software brought hospitals, airlines, and companies around the world to a grinding halt, causing billions of dollars’ worth of damage. It is only the latest in a long line of memory safety failures; others, such as Heartbleed, the SQL Slammer worm, and LogoFAIL, have handed attackers full access to systems and private information.

What are the solutions?

Against this background, regulatory bodies are advising industry to move away from tools known to be ineffective at preventing memory safety errors. Primarily, this refers to the C and C++ programming languages, which have been used in regulated software development for several decades. While the risks of using these languages can be mitigated by layering additional checks and tests on top, this can be seen as akin to using an old quilt for protection after the invention of Kevlar.

There are innovations on the hardware side – the CHERI initiative is particularly interesting – and on the software side that directly address memory safety. A leading solution among the software innovations is the Rust programming language.

With memory safety as one of its core tenets, Rust is rapidly gaining traction and seeing increasing use at major software vendors and in regulated devices. Google now writes large parts of Android in Rust and, as a result, has seen the number of new memory safety bugs fall rapidly. Microsoft and Amazon are also integrating it into their systems and are major sponsors of the Rust Foundation, which works to improve and codify the language.

The work done by these leading software companies can be leveraged in medical device software. Many teams have had success initially building small parts of their system in Rust, then expanding to the rest of the system.
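One common pattern for that incremental approach is to put the new Rust component behind a C-compatible interface, so the existing C/C++ code can call it without modification. A minimal, hypothetical sketch (the function name and range check are illustrative, not taken from any particular product):

    // Rust library compiled as a C-linkable library
    // (crate-type = ["staticlib"] in Cargo.toml).

    /// Check that a probe temperature reading is within a plausible range.
    /// `extern "C"` and `#[no_mangle]` give the function a stable C name and
    /// calling convention, so existing C/C++ firmware can link to it directly.
    #[no_mangle]
    pub extern "C" fn reading_is_plausible(temperature_celsius: f32) -> bool {
        // All of the logic stays in safe, compiler-checked Rust; only this
        // boundary is exposed to the C side.
        (30.0_f32..=45.0).contains(&temperature_celsius)
    }

On the C side, the only change is a declaration such as bool reading_is_plausible(float temperature_celsius); the rest of the codebase is untouched.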

The key difference between Rust and C/C++ is the ability to clearly mark sections of code as “unsafe” from a memory perspective. By default, code in Rust can be trusted to read from and write to memory safely. Code which is marked as “unsafe” can be checked carefully by engineers, as it makes up only a small fraction of the code written. In C/C++, by contrast, memory-unsafe code can appear anywhere, and the languages themselves offer engineers very little help in preventing these errors.
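As a short illustration of that default, consider the following sketch (the register address is made up for the example). Ordinary Rust code is checked by the compiler; the rare place that genuinely needs raw memory access must be wrapped in an explicit unsafe block, which gives reviewers a small, searchable surface to scrutinize:

    /// Safe by default: the compiler guarantees this function cannot read or
    /// write outside the slice it is given.
    pub fn average(readings: &[f32]) -> f32 {
        readings.iter().sum::<f32>() / readings.len() as f32
    }

    /// The exception is explicit. Reading a memory-mapped hardware register
    /// needs a raw pointer, and the `unsafe` block marks exactly the lines a
    /// reviewer must check by hand. (The address here is purely illustrative.)
    pub fn read_status_register() -> u32 {
        let status_register = 0x4000_0000 as *const u32;
        unsafe { core::ptr::read_volatile(status_register) }
    }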

How to Start

Companies can begin by identifying projects where they can experiment with these new tools to develop the requisite engineering skills and new processes. For example, software components which have had memory safety issues, or which are critical to the security of the system, can be rewritten in a memory safe language and evaluated against the original implementation.

There are two important factors for medical devices which need to be considered: hardware support and regulated toolchains. Most medical devices do not run on a normal computer; they use a specialized chip such as a microcontroller.

Traditionally, vendors have provided hardware support for their chips using the C language, whereas most hardware support in Rust comes from the open-source developer community. Currently, this support is not as fully featured as the vendor alternatives, and it poses challenges in the regulatory process. Companies need to investigate how this affects their software and work with vendors on better support.
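To give a flavor of what that community support looks like, the sketch below assumes the community-maintained embedded-hal crate (its OutputPin trait, version 1.x) and a hypothetical alarm LED. Driver code is written once against the shared trait, and the chip-specific details come from whichever community HAL crate matches the microcontroller in the device:

    // Written against the community `embedded-hal` 1.x traits rather than a
    // vendor's C SDK. The function is generic over any pin type implementing
    // `OutputPin`, so the same safe Rust code works unchanged across
    // microcontrollers whose community HAL crates implement the trait.
    use embedded_hal::digital::OutputPin;

    /// Switch a (hypothetical) alarm indicator LED on or off.
    pub fn set_alarm_led<P: OutputPin>(led: &mut P, on: bool) -> Result<(), P::Error> {
        if on {
            led.set_high()
        } else {
            led.set_low()
        }
    }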

Regulated toolchains are coming online, driven mostly by the automotive sector. The leading implementation here is Ferrocene, which already has supporting documentation for IEC 62304. Additional offerings for hardware support and regulated toolchains are expected as demand increases.

Outlook

Future medical devices will contain software as a central component to enable new functionality and updates throughout the product life cycle. Regulators are focusing on software and cybersecurity to ensure that customers are not vulnerable to malicious data hacks or unpredictable device crashes. Therefore, it is essential that medical device manufacturers transition to new tools designed with memory safety in mind from the start. The Rust programming language is a prime candidate to be a key building block for memory safe software. By finding ways to integrate Rust into their software today, companies can ensure they stay ahead of regulations, offer the best experience for customers, and outshine their competitors.


Tim Guite
Consultant