In-brief: The design of wearable technology risks repeating the mistakes of the past, including poor security and privacy features that could pose a risk to consumers, according to a new report from IEEE, the technical professional organization.
The group issued the report “WearFit: Security Design Analysis of a Wearable Fitness Tracker” (PDF) to showcase ways in which common security design flaws may impact wearable technology like fitness trackers and watches. Those flaws could leave the devices and their owners susceptible to crimes such as identity theft, malware infections and denial of service attacks, IEEE warned.
“With the popularity of wearables, we figured this was a good type of system to look at and apply our Top 10 Security Design Flaws,” said Jacob West, a security architect at the firm NetSuite, who was one of the report’s authors.
Industry analysts report that nearly half of the population is expected to use wearable fitness-tracking devices by 2019. The expected popularity of wearable technology in the coming years combined with insecure design could make the devices a “ripe playground” for hackers and cyber criminals, West said.
To underscore the dangers, the IEEE researchers designed a fictitious wearable fitness tracker called “WearFit,” modeling its architecture and components on actual wearable products: a wearable tracker, a companion mobile application and a cloud-based management platform. The goal was to show how proper design and planning can mitigate common types of threats and attacks.
“We want to expand the focus to have equal attention to design mistakes that lead to security problems just like software coding mistakes,” West said.
The team designed the WearFit system to emphasize integrity, allowing each component of the device ecosystem to be able to verify their identity to other parts of the ecosystem. That way, efforts to compromise the integrity of the system using spoofed devices or so-called “man in the middle” attacks would have a high bar to clear. For example, the researchers propose a pairing process between the wearable device and the companion mobile application in which both sides present a “visual representation of the same token so that the user can verify a match. Once the user confirms a match, each device stores the other’s identity,” the report notes. Firmware updates to the device are cryptographically signed to prevent malicious updates.
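The pairing scheme described above resembles the numeric-comparison pattern used in protocols like Bluetooth secure pairing: both sides derive a short token from shared secret material, display it, and complete pairing only after the user confirms a match. The sketch below, which is illustrative rather than taken from the report, shows that pattern along with a firmware-signature check; function names and the use of HMAC are assumptions (a real device would derive the secret via a key exchange such as ECDH and verify firmware with an asymmetric signature like Ed25519, keeping only the public key on-device).

```python
import hashlib
import hmac
import secrets

def pairing_token(shared_secret: bytes) -> str:
    """Derive a short 6-digit token that both the tracker and the
    mobile app display; the user confirms the two match before
    pairing completes and each side stores the other's identity."""
    digest = hashlib.sha256(shared_secret).digest()
    # Reduce the first 4 bytes of the digest to a 6-digit code.
    value = int.from_bytes(digest[:4], "big") % 1_000_000
    return f"{value:06d}"

def verify_firmware(image: bytes, signature: bytes, key: bytes) -> bool:
    """Accept a firmware update only if its signature verifies,
    blocking malicious or tampered images. HMAC is used here for
    brevity; a production design would use asymmetric signing."""
    expected = hmac.new(key, image, hashlib.sha256).digest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, signature)

# Usage: both devices derive the same token from the shared secret.
secret = secrets.token_bytes(32)
print(pairing_token(secret))

# A signed image verifies; a tampered image does not.
fw, key = b"firmware-v2", secrets.token_bytes(32)
sig = hmac.new(key, fw, hashlib.sha256).digest()
print(verify_firmware(fw, sig, key))        # True
print(verify_firmware(fw + b"x", sig, key))  # False
```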
Trust problems with wearable devices also extend to third-party firms participating in the wearable ecosystem, including advertising providers. With the WearFit, the trust relationship with partners is explicitly built and verified by making correct use of certificates, pre-shared keys transmitted over secure channels, and binding legal contracts, the report says.
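One common way a pre-shared key is used to verify a partner is challenge-response authentication: the platform sends a random challenge, and only a partner holding the same key can compute the correct response. The sketch below is a minimal illustration of that idea, not code from the report; the function names and the HMAC construction are assumptions.

```python
import hashlib
import hmac
import secrets

def respond(challenge: bytes, psk: bytes) -> bytes:
    """Partner side: answer a challenge using the pre-shared key."""
    return hmac.new(psk, challenge, hashlib.sha256).digest()

def authenticate(psk: bytes, response_fn) -> bool:
    """Platform side: issue a fresh random challenge and verify the
    partner's response in constant time. A fresh challenge per
    attempt prevents replay of old responses."""
    challenge = secrets.token_bytes(16)
    expected = hmac.new(psk, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response_fn(challenge))

# Usage: a partner with the right key passes; one without it fails.
psk = secrets.token_bytes(32)
print(authenticate(psk, lambda c: respond(c, psk)))                    # True
print(authenticate(psk, lambda c: respond(c, secrets.token_bytes(32))))  # False
```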
Protecting the system against attacks against data was a top concern, West said, given that data is almost certainly the avenue that a malicious attacker would take to compromise the device. “From a design standpoint, this is incredibly important,” he said. “You will have folks touching and interacting with data, so you must have a design from the beginning that codifies different types of data and how it should and shouldn’t be treated. That’s paramount.”
Still, West emphasized that the design was merely a thinking exercise and that the group’s plans never made it past the whiteboard. In some cases – such as with the assumptions about the hardware running on the fitness device – the WearFit group made decisions to prioritize security over factors like wearability or cost.
“We were at the aggressive end of the hardware spectrum with regard to what was possible to put onboard,” West said. “We wanted to underscore what good crypto looks like and that meant more powerful hardware than might otherwise be used,” he said.
Even though the WearFit will never see the light of day, West said the exercise should be valuable to would-be wearable technology makers. Among the most interesting conclusions was the need for more security and design talent to inform the wearable technology sector. “You need smart people getting together and understanding the system they’re building and what the security requirements of the system are,” he said. “Software has little to do with it. You need people to solve the problem.”
Security threats against wearable computing devices are rare – but not unheard of. In 2013, researchers at Florida International University found exploitable holes in the Fitbit device, its companion web-based software and the communications protocol used to exchange data between the two. They called it an example of the “careless integration of health data into social networks” that was “fraught with privacy and security vulnerabilities.” In November, for example, there were reports that body cameras used by law enforcement and the military shipped with the Conficker virus.