Buggy medical software: Is static analysis the cure?

Recently, the Baltimore Sun published an article on how the FDA is using static analysis tools to uncover software defects in medical devices. Software defects now account for about 20% of medical-device recalls, so it's no surprise that the FDA has become interested in tools that can: a) detect existing defects, and b) prevent new defects from occurring.

The article has generated responses from editors of medical device publications, developers of medical software, a ZDNet blogger, and even medical malpractice lawyers.

Nobody has a problem with the FDA uncovering errors that could harm or kill someone. Some folks seem worried, however, that the FDA will force medical-device companies to adopt static analysis as a standard development practice. Their objections fall into several categories, including:

Static analysis tools are expensive — I’m no expert on the total cost of buying, learning, and maintaining static analysis tools. But, in their defense, they can detect a variety of bugs early in the development cycle. And the earlier you catch bugs, the cheaper it is to fix them.

Static analysis tools generate too many false positives — Static analysis tools can report a software error when, in fact, none exists. Some argue that chasing these spurious reports wastes precious development time. However, vendors of static analysis tools say that their products now implement highly advanced techniques to minimize this problem.

Static analysis can’t always predict how software will behave when it’s actually running — Agreed, but I doubt that anyone believes static analysis tools should be used alone. At QNX, we recently published a whitepaper on how developers can combine static analysis tools and runtime analysis tools to achieve higher product quality.

What about you? Do you use static analysis tools? Do you develop them? Do you think the FDA would be justified in forcing medical-device companies to use static analysis? Or are there other, better ways to ensure software quality?


Denton Gentry said...

I used Klocwork a few years ago, when it was a relatively new entrant in the market. It did find real problems, mostly regarding array indexing. It took quite a while to set up, and eventually we gave up on having individual developers try to run it. It became a responsibility of the buildmaster, who did a weekly run.

Brendan Harrison said...

I work for Klocwork, a vendor of static code analysis tools. Obviously, I’ve been following these articles with interest. Medical device companies (and all organizations developing safety- or mission-critical software) are always on the hunt for new ways to augment their software validation – we work with many customers in these markets, and have for many years. No single tool – including static analysis of course – can act as a silver bullet, so companies use a range of tools, techniques and processes. Static analysis, in my admittedly biased view, should be part of this kit bag to automate the verification of source code for certain categories of bugs. It finds real crash-causing bugs, and it also finds a variety of less critical bugs that can cause long-term software maintenance problems or can become crash-causing bugs if runtime configuration and context changes. So, this technology does provide real value and many medical device companies are already using these tools today.

Bob said...

Hi Paul,

Thanks for the link.

I'm an old QNXer myself. I started using QNX 2 in ~1991 and created several generations of medical device software with it over more than 10 years. I did a little with QNX 4 but moved on before I was able to get too deep into it. Now it's all CE and XP Embedded. I still have fond memories of QNX, though.

Best Regards,