Say ‘technology monoculture’ and most people (those who don’t look at you cross-eyed or say ‘God bless you!’) will answer “Microsoft” or “Windows” or “Microsoft Windows.” That makes sense. Windows still runs on more than 90% of all desktop systems, even as Redmond’s star is said to have dimmed next to Apple’s. Microsoft is the poster child for both the dangers and the benefits of a monoculture. Hardware makers and application developers have a single platform to write to, and consumers have confidence that the software and hardware they buy will “just work” so long as they’re running some version of Windows.
The downside, of course, is that the Windows monoculture has also been a boon to bad guys, who can tailor exploits to one operating system or associated application (Office, Internet Explorer) and be confident that 9 of 10 systems their malicious software encounters will at least be running some version of the software they’re targeting – even if it isn’t the vulnerable version they’re looking for.
In a now-famous 2003 essay, “CyberInsecurity: The Cost of Monopoly” (archived here at the site Cryptome.org), Dr. Dan Geer argued persuasively that Microsoft’s operating system monopoly constituted a grave risk to the security of the United States and to international security as well.
It was in the interest of the U.S. government, and others, to break Redmond’s monopoly or, at least, to loosen Microsoft’s ability to ‘lock in’ customers and limit choice. “The prevalence of security flaw [sic] in Microsoft’s products is an effect of monopoly power; it must not be allowed to become a reinforcer,” Geer wrote.
The essay cost Geer his job at the security consulting firm @stake, which then counted Microsoft as a major customer.
These days Geer is the Chief Security Officer at In-Q-Tel, the CIA’s venture capital arm. But he is no less wary of the dangers of software monocultures. Writing today on the blog Lawfare, Geer is again warning about the dangers that come from over-reliance on common platforms and code. This time his target isn’t proprietary software managed by Redmond, however, but the open source OpenSSL library at the heart (pun intended) of Heartbleed.
“The critical infrastructure’s monoculture question was once centered on Microsoft Windows,” he writes. “No more. The critical infrastructure’s monoculture problem, and hence its exposure to common mode risk, is now small devices and the chips which run them.”
For Geer, Heartbleed is symptomatic of a larger problem in the (fast emerging) post-Windows era: our increasing reliance on common hardware and software components that, while they may not qualify as ‘critical infrastructure,’ can still cause mass disruption and impose steep costs on society when things go wrong.
“Heartbleed is instructive,” Geer writes on Lawfare. “Its deployment was not wide enough to be called an Internet-scale monoculture and yet the costs are substantial.”
What if Heartbleed had been an actual monoculture, Geer wonders. What if OpenSSL were not just a popular platform for managing secure communications but the ‘Windows’ of encryption, and the Heartbleed flaw one that affected “not just the server side of a fractional share of merchants but every client as well?”
As with his warning about Windows, Geer seems to suggest that wholesale disruption of the Internet (and, therefore, of the economy and the government) isn’t out of the question. Widely distributed components like OpenSSL create the conditions for what is termed ‘common-mode’ software failure, in which diverse computer systems come to rely on a single common element, whether hardware or software, and fail together when that element does.
“The Internet, per se, was designed for resistance to random faults; it was not designed for resistance to targeted faults,” Geer warns. “As the monocultures build, they do so in ever more pervasive, ever smaller packages, in ever less noticeable roles. The avenues to common mode failure proliferate.”
Geer notes endemic problems like the vast population of out-of-date and vulnerable home broadband routers as an example of “an effective monoculture, albeit within a domain that is almost but not quite Internet-scale.” Security firms have already noted successful attempts to harness this massive population of vulnerable (and in many cases unmanageable) devices for use in botnets and denial of service attacks.
Heartbleed raises the same kind of concern, just across a broader range of systems.
“The Heartbleed problem can be blamed on complexity; all Internet standards become festooned with complicating option sets that no one person can know in their entirety,” Geer writes. “The Heartbleed problem can be blamed on insufficient investment; safety review for open source code is rarely funded, nor sustainable when it is. The Heartbleed problem can be blamed on poor planning; wide deployment within critical functions but without any repair regime.”
The solution? Geer comes down squarely in favor of structural fixes. The Internet community (which includes both the private and the public sector) can work together to retrench and review the security of any commonly used platform. That’s already happening with OpenSSL. The question now is: what other platforms might harbor Heartbleed-type flaws that are yet to be discovered? That’s a simple-sounding but complicated fix. The other options are no less daunting: chasing vulnerabilities out of the software supply chain (an FDA-like approach), or building systems that heal and recover from faults quickly enough to all but close the window of vulnerability. (This is what is sometimes referred to as ‘autonomic computing,’ and it’s a work in progress, to say the least.)