Last spring, Joshua Corman found himself staring in disbelief in a New Hampshire hospital. Corman, a cybersecurity researcher and self-described “good-guy hacker,” had brought his 11-year-old daughter in for blood work. After several hours of waiting, it was clear it would be a long night, so he’d left, briefly, to grab her pajamas from home. Now, he’d returned to find her fitted with a Hospira infusion pump, a network-connected model he recognized as particularly vulnerable to cyberattacks.
“Almost exactly two years after the FDA said to hospitals, ‘Stop using this device; it’s too dangerous to use,’ they’re using it on my kid,” Corman told me.
Compounding his concern was that he’d spent the prior few days working on the U.S. response to a ransomware known as WannaCry, which infiltrated medical centers worldwide, as well as some universities and telecommunications firms, temporarily hobbling their networks. Ransomware is a type of malware that typically encrypts everything on a computer’s hard drive; then, with the contents locked behind a password, attackers often demand payment from victims in exchange for not permanently destroying the files. So Corman was steeped in a fresh understanding of how unprepared many institutions were for a large-scale hack of their systems, and of how far their security standards often lagged behind federal recommendations.
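The extortion scheme behind WannaCry rests on ordinary symmetric encryption: the same key that scrambles a file is the only thing that can unscramble it, and the attacker keeps that key. A toy sketch of the principle, operating only on a short in-memory byte string (the XOR "cipher" and the hypothetical patient record are illustrative stand-ins; real ransomware uses strong ciphers like AES):

```python
import secrets

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR each byte against a repeating key.
    # The point is the asymmetry of knowledge, not the cipher itself:
    # without the key, the scrambled bytes are indistinguishable from noise.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

record = b"Patient: J. Doe / Infusion rate: 2 mL/hr"  # hypothetical file contents
key = secrets.token_bytes(32)                          # attacker holds this secret

locked = xor_stream(record, key)          # what the victim is left with
assert locked != record                   # the original bytes are gone
assert xor_stream(locked, key) == record  # only the key holder can restore them
```

The victim's only options are restoring from backup, paying in hopes of receiving the key, or losing the data, which is why unpatched, network-connected machines make such attractive targets.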
It turns out that many of the medical devices that populate hospitals, clinics, and doctors’ offices nationwide have remained in service for years, or even decades. If network-connected equipment like MRI machines or infusion pumps still worked, hospitals traditionally saw little need to replace them. Now Corman, who in addition to being a concerned father is a Cyber Safety Innovation Fellow at the Atlantic Council think tank and a co-founder of I Am the Cavalry, a prominent volunteer collective of “white-hat” hackers, is leading a group of cybersecurity professionals in making the case that older devices tend to harbor glaring software vulnerabilities that hackers could exploit — and that they typically aren’t salvageable.
“The big problem is that hospitals don’t buy new devices, and they keep using really dangerous ones ad infinitum — until they just stop working,” Corman said.
Corman wants these old, unsecured devices gone from hospitals. The fear is that, beyond freezing systems or hijacking medical records as they did during WannaCry, hackers could also actively manipulate medical equipment to harm patients by, say, administering a lethal dose of medication via an infusion pump. While newer devices aren’t ironclad, they are typically built with more robust security features. So Corman and others are urging health-care providers to scrap old, or “legacy,” equipment and replace it with newer models.
To nudge health-care providers to trade up, he’s put forth an idea for an incentive program akin to “Cash for Clunkers,” the 2009 federal auto-rebate plan that aimed to run gas-guzzling cars off the road. Under that program, which was formally called the Car Allowance Rebate System, people received cash in exchange for turning in fuel-inefficient vehicles, which they could then put toward new, more efficient ones. (The program fizzled after a few months, when it depleted its allotted budget.) Similarly, in this version, health-care providers would be compensated for junking old equipment, and could use the rebates toward the purchase of new devices. Corman said he hasn’t fully worked out the economics, but he believes device makers might be willing to subsidize the program in part, since it would help them move inventory.
A proposal along those lines would be welcome at a large hospital like the Mayo Clinic in Rochester, Minnesota, says Kevin McDonald, director of clinical information security there. Tens of thousands of its networked devices were built many years ago, in what McDonald calls “a kinder, gentler time, before there was malware and ransomware.” Older machines’ vulnerabilities might include a password that can’t be changed, outdated third-party software like Microsoft Windows XP, or an inability to incorporate “patches,” or software fixes. Other software bugs simply accumulate as an operating system ages.
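McDonald's checklist of red flags — unchangeable passwords, end-of-life operating systems, no patch mechanism — is concrete enough that a hospital could screen a device inventory against it automatically. A minimal sketch, with an inventory record format and field names of my own invention, not any real asset-management schema:

```python
# Operating systems whose vendors no longer ship security fixes (illustrative list).
END_OF_LIFE_OS = {"Windows XP", "Windows 2000", "Windows Server 2003"}

def legacy_risk_flags(device: dict) -> list[str]:
    """Return human-readable warnings for a hypothetical inventory record."""
    flags = []
    if device.get("hardcoded_password"):
        flags.append("password cannot be changed")
    if device.get("os") in END_OF_LIFE_OS:
        flags.append(f"runs end-of-life OS: {device['os']}")
    if not device.get("patchable", True):
        flags.append("cannot accept software patches")
    return flags

pump = {"name": "Infusion pump", "os": "Windows XP",
        "hardcoded_password": True, "patchable": False}
print(legacy_risk_flags(pump))  # all three warnings fire for this device
```

A real audit would pull these fields from a clinical asset-management system rather than hand-written dictionaries, but the triage logic is the same.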
At the moment, the medical-device Cash for Clunkers idea is just that — an idea. It’s one of many proposed solutions to come out of a Health Care Industry Cybersecurity Task Force convened in 2016 by the Department of Health and Human Services, the parent agency of the FDA, which regulates medical devices. Language in the report spells out, in part, that “government and industry should develop incentive recommendations to phase-out legacy and insecure health-care technologies (e.g., incentive models like Cash for Clunkers),” and calls for “better procurement practices.” Dr. Suzanne Schwartz, the FDA’s associate director for science and strategic partnerships at its Center for Devices and Radiological Health, told me that the agency has in recent years been working to bring good-guy hackers like Corman into the fold alongside more traditional constituencies like device vendors, and to incorporate their suggestions.
“Security researchers have played a very prominent role in our efforts, primarily because they are bringing in a kind of expertise from a technical perspective that frankly did not really exist in a consistent manner across the health-care community and across the medical-device community,” she said.
Schwartz said white-hat hackers began to approach her in 2013 with issues they’d identified in device software. The following year, researcher Billy Rios alerted the Department of Homeland Security to his finding that certain models of the Hospira infusion pump could be digitally manipulated. (The device maker has since been acquired by ICU Medical.) His warning slowly filtered through to the FDA, which issued an advisory in 2015 discouraging hospitals from using the pump. Yet, as Corman found, it’s still in use in many clinical environments.
For his part, Corman believes that the FDA advisory reflected a shift in the agency’s thinking around cyberthreats. In prior years, he said, “we’d have to wait for somebody to die” for the FDA to take action. “I’m not even sure one death would have done it,” he added. Corman said that he and others made the case that cyberthreats work differently, and require action to block pathways to harm before someone gets hurt. “It’s not like a defect where 1 in 10,000 might have a flaw,” as with an analog device, he said. “You can go from no exploitation to 100 percent exploitation instantly.” Now, he said, the agency is more attentive to software vulnerabilities that have been identified but not breached.
As for the Cash for Clunkers proposal, in an email following our conversation, Schwartz called it an “intriguing idea that is worthy of further exploration,” and said “a deeper dive that integrates an economic analysis” would be required. That would likely fall under the remit of the Healthcare and Public Health Sector Coordinating Council (HSCC), an independent body that brings together public- and private-sector groups, she wrote. Greg Garcia, the HSCC’s executive director for cybersecurity, said the idea has been the subject of some discussion, but a serious proposal hasn’t been put forth.
More immediately implementable, Schwartz said, was a second suggestion from the task force that the FDA eventually require device makers to provide what it dubbed a “software bill of materials.” Corman analogized the concept to an ingredients list comprising all the various software programs on a device. If a large-scale cyberattack targeted one or more of the programs, at least hospitals would be aware that their machines might be compromised, and could take steps to limit any damage. In its April 2018 Medical Device Safety Action Plan, the FDA said it is considering seeking additional authorities to require the documentation as part of device makers’ premarket submissions for FDA review.
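The usefulness of Corman's ingredients-list analogy is easy to demonstrate: given a bill of materials per device, matching an advisory against a fleet becomes a set intersection. A simplified sketch, with made-up device names and component strings (real SBOM formats such as SPDX and CycloneDX carry far more detail, including versions and cryptographic hashes):

```python
# Hypothetical, radically simplified SBOMs: each device maps to the set of
# software components it contains.
sboms = {
    "MRI scanner":   {"Windows 7", "libpng 1.2", "OpenSSL 1.0.1"},
    "Infusion pump": {"VxWorks 6.9", "BusyBox 1.20"},
}

def affected_devices(advisory_components: set[str]) -> list[str]:
    # Given the components named in a security advisory, return every
    # device whose bill of materials includes at least one of them.
    return sorted(name for name, parts in sboms.items()
                  if parts & advisory_components)

# e.g., an advisory naming a vulnerable OpenSSL build:
print(affected_devices({"OpenSSL 1.0.1"}))  # → ['MRI scanner']
```

Without the SBOM, a hospital learning of a flaw in, say, a specific OpenSSL build has no way to know which of its thousands of machines contain it; with one, the lookup is trivial.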
Corman said he’s encouraged by that progress, particularly since, until recently, cybersecurity risks of institutional medical devices haven’t received much mainstream attention. Instead, until the WannaCry attack occurred, most of the conversation around cybersecurity in medicine pertained to personal, at-home devices, like internet-connected pacemakers or insulin pumps, the risks of which have been well-documented in popular culture. For example, at a Miami cybersecurity conference in 2011, the late hacker Barnaby Jack demonstrated to much fanfare how a wireless insulin pump could be manipulated to deliver a lethal dose. In 2012, the television show Homeland depicted an attack on a wireless pacemaker. Former Vice President Dick Cheney later revealed that his doctor had ordered the wireless features of his own pacemaker disabled over just such fears, and called the episode’s scenario credible. Meanwhile, the FDA last year issued a recall for nearly half a million internet-connected pacemakers that it deemed vulnerable to potential cyberattacks. But there was a fix: The device maker, Abbott (formerly St. Jude Medical), soon made available a firmware update that plugged the holes that hackers might have exploited. All patients had to do was download it.
Corman doesn’t discount the cybersecurity risks associated with these personal products. But it’s the institutional devices, which are both broadly used and often too old to patch, that are of particular concern to him. Moreover, it can often take six or seven years for a device to move from the design stage to market. Even if hospitals replaced all of their legacy devices with current models, this year’s products, conceived years ago, don’t necessarily reflect current best practices in cybersecurity.
“Right now, you can get a device approved through FDA on Windows XP,” Corman said. “It’s super, super old. Microsoft isn’t maintaining it any more. Yet you could bring a brand-new device to market on that that’s going to live for another 15 years.”
The Mayo Clinic’s McDonald agreed, adding that much of what’s currently available falls short of ideal security standards, and that a broad, industrywide evolution must take place. “We’re still purchasing insecure medical devices,” he said.