On the surface, the kinds of industrial control systems that run a power plant or factory floor are very different from, say, a drug infusion pump sitting bedside in a hospital intensive care unit. But two security researchers say that many of these systems have two important things in common: they’re manufactured by the same company, and contain many of the same critical software security problems.
In a presentation at a gathering of industrial control security experts in Florida, researchers Billy Rios and Terry McCorkle said an informal audit of medical devices from major manufacturers, including Philips, showed that medical devices contain many of the same kinds of software security holes found in industrial control system (ICS) software from the same firms.
The research suggests that lax coding practices may be institutionalized within the firms, amplifying their effects.
Rios (@xssniper), a security researcher at Google, and McCorkle (@0psys), the CTO of SpearPoint Security, told attendees at S4 in Miami that they conducted their research out of curiosity and in an effort to branch out from investigating industrial control systems. Using eBay, they purchased second-hand medical devices, often from hospitals. They soon realized that many of the names they came across were familiar, among them General Electric, Siemens, Honeywell and Philips.
“The same PLC (programmable logic controller) vulnerability that you see on ICS software, you also see on medical device software,” Rios told Security Ledger in a phone interview.
In fact, the two were able to run fuzzing programs designed for ICS software on the medical devices and find critical bugs in a matter of minutes. “We just look at the services they were running and threw the same fuzzers we wrote for ICS software at them and, within ten minutes, we found a heap overflow,” he said, describing the exploitable software vulnerability.
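The approach Rios describes — identifying a listening service and throwing mutated input at it until something breaks — can be illustrated with a minimal sketch. This is not the researchers’ actual tooling; the mutation strategy, host, port and seed packet below are illustrative assumptions.

```python
import random
import socket

def mutate(seed: bytes, rng: random.Random, max_flips: int = 8) -> bytes:
    """Return a copy of `seed` with a few randomly chosen bytes replaced."""
    data = bytearray(seed)
    for _ in range(rng.randint(1, max_flips)):
        data[rng.randrange(len(data))] = rng.randrange(256)
    return bytes(data)

def fuzz_service(host: str, port: int, seed: bytes, iterations: int = 1000) -> None:
    """Fire mutated packets at a TCP service (hypothetical target).

    A service that stops answering after a particular payload is a
    candidate for a memory-corruption bug such as a heap overflow.
    """
    rng = random.Random()
    for i in range(iterations):
        payload = mutate(seed, rng)
        try:
            with socket.create_connection((host, port), timeout=2) as conn:
                conn.sendall(payload)
        except (ConnectionError, socket.timeout):
            print(f"iteration {i}: service stopped responding; payload {payload.hex()}")
            break
```

Real fuzzers track coverage and reproduce crashes; the point here is only how little sophistication the technique requires when the target does no input validation.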
Among the products purchased was an XPER Physiomonitoring 5 system from Philips, which is used to monitor patient vital signs in interventional cardiology labs. The software powers nurses’ stations that communicate with medical devices from various manufacturers using the HL7 (Health Level 7) standards.
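HL7 v2, the messaging standard mentioned above, is pipe-delimited text: a message is a series of segments (MSH, OBX, and so on), each a list of fields. A rough sketch of how such a message decomposes, with invented sample values (real HL7 parsing has additional quirks, such as MSH field numbering and escape sequences):

```python
def parse_message(msg: str) -> list[dict]:
    """Split an HL7 v2-style message into segments.

    Each segment becomes a dict with its three-letter type and raw fields.
    This is a simplified sketch, not a conformant HL7 parser.
    """
    segments = []
    for line in msg.strip().split('\r'):  # segments are CR-separated
        fields = line.split('|')          # fields are pipe-separated
        segments.append({'type': fields[0], 'fields': fields[1:]})
    return segments

# Illustrative two-segment message: a header plus one numeric observation.
sample = "MSH|^~\\&|MONITOR|ICU\rOBX|1|NM|8867-4^Heart rate^LN||72|bpm"
```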
The problems they found in the gear from Philips and other products were not subtle, Rios said. “These are straightforward security issues,” he said. “You have a service that’s running as a privileged user and listening on a certain port. You send some bytes to that port and take over the device.”
McCorkle said that the problems he and Rios discovered added up to more than just a few, patchable software holes.
“This isn’t just bad programming or bad software practices – buffer overflows and heap overflows and directory traversals that we see in industrial control systems,” he told Security Ledger. “It’s also bad design practices.”
In one case, the researchers found that a back door administrator account for supporting customers using the Siemens Syngo Expert software was described in the user documentation and protected by a four digit password. Syngo Expert allows physicians to remotely monitor magnetic resonance imaging (MRI) systems inside the hospital.
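A four-digit numeric password gives only 10,000 possible values, which an attacker who knows the account exists (here, from the user documentation) can exhaust almost instantly. A sketch of why this is trivial; the check callback is hypothetical:

```python
from itertools import product
from typing import Callable, Optional

# Every possible four-digit numeric password: exactly 10**4 candidates.
KEYSPACE = [''.join(digits) for digits in product('0123456789', repeat=4)]

def brute_force(check: Callable[[str], bool]) -> Optional[str]:
    """Try each candidate until `check` (a stand-in for the login) accepts one."""
    for candidate in KEYSPACE:
        if check(candidate):
            return candidate
    return None
```

Even at a throttled one attempt per second, the full keyspace falls in under three hours; without throttling, in well under a second.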
In another case the group studied, a maker of patient monitoring systems offered physicians a free iPad application that could be used to remotely connect to a patient monitoring system within a hospital via RDP (remote desktop protocol). However, the application stored doctor credentials locally on the iPad and gave users the ability to run any program on the central monitoring system.
“You have the ability to run whatever program you want. So, as a user, you can say ‘I don’t want to run the monitoring program, I want to run CMD.EXE or some malware,’” McCorkle said. “And these are critical boxes that are monitoring patients in the ICU.”
Rios and McCorkle said that they contacted the Department of Homeland Security’s ICS-CERT about the problems they discovered. Mario Fante, a Senior Manager of Public Relations at Philips told The Security Ledger that the company had been informed of the presentation by ICS-CERT, but did not know what the content of the presentation was.
In an e-mail message, Fante said that Philips Healthcare has a comprehensive product security apparatus. Philips, he said, “operates under a global product security policy governing design-for-security in product creation, as well as risk assessment and incident response activities for vulnerabilities identified in existing products.” The company has a “global problem-tracking and escalation system” and a “global network of product security officers and teams” to assess and prioritize vulnerabilities.
But Rios was skeptical that such a robust system would have overlooked so obvious a flaw.
“Obviously, a remote unauthenticated vulnerability against the default configuration seems like a pretty obvious issue to miss during a robust security review,” he wrote.
Software problems – broadly defined – accounted for almost a quarter of all medical device recalls in 2011. Most recently, in November, the device maker Hospira issued a voluntary, nationwide recall of its Symbiq brand infusion systems after discovering a software error that caused the touch screen interfaces on the devices to respond incorrectly to user input. The problem could result in “a delayed response and or the screen registering a different value from the value selected by the user,” the company said in a statement.
Kevin Fu, a professor at the University of Michigan’s Department of Electrical Engineering and Computer Science, said that things are still “at the very early stage” of addressing the issue of security in medical devices. “Some stakeholders are clearly in denial. In the desktop computer world, it goes with the territory that there’s malware, but there are these pocket cultures,” he said.
Fu will teach a graduate course at the University of Michigan this semester that focuses solely on IT security issues in medical devices, teaching engineering concepts and skills for creating more trustworthy software-based medical products.