The Lesson We Should Learn From WannaCry


The WannaCry ransomware — which overtook hundreds of thousands of computers at hospitals, universities, and telecommunications companies around the world this weekend — marks the arrival in the present of a terrifying and once-distant future. Over the last few years, aided by the development of instantaneous and relatively difficult-to-trace blockchain payments, ransomware has grown in power, with occasional one-off attacks hitting hospitals and governments, and even private individuals unfortunate enough to make an ill-fated click. It was only a matter of time before someone combined it with the self-propagating powers of a computer worm. For years, cybersecurity experts have been preaching the necessity of strong security across the web and throughout consumer technology — not just on servers and hard drives where vital records are stored, but also on every other part of the network. WannaCry is why.

Ransomware is malware that encrypts the contents of a computer’s hard drive, locking them behind a key that only the attacker holds. It also imposes a time limit: Pay up soon or the files are gone forever. Ransomware is chillingly effective because it takes advantage of structural and beneficial features of the internet. Widely available strong encryption makes it virtually impossible to brute-force the key — that is, to guess it by automated trial and error, even at millions of attempts per second. Complex and interconnected systems like those that serve hospitals or utility companies are vulnerable on several fronts, and are difficult to maintain and secure because of that very complexity and interoperability.
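The arithmetic behind that impossibility is easy to check. Here is a back-of-the-envelope sketch in Python — the key size and guessing rate are illustrative assumptions, not figures from WannaCry itself:

```python
# Illustrative only: how long a brute-force search of a 128-bit key would
# take at an (optimistic) one billion guesses per second. Both numbers are
# assumptions chosen for the example, not details of any specific attack.

KEY_BITS = 128                       # a common symmetric key size
GUESSES_PER_SECOND = 1_000_000_000   # a generous rate for the attacker

keyspace = 2 ** KEY_BITS                     # total keys to try
seconds = keyspace / GUESSES_PER_SECOND      # worst-case search time
years = seconds / (60 * 60 * 24 * 365)

print(f"Keys in the space: {keyspace:.2e}")
print(f"Years to exhaust:  {years:.2e}")
```

Even at a billion guesses per second, exhausting the keyspace takes on the order of 10^22 years — which is why victims cannot simply crack their way back into their own files.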

Most nerve-racking of all, WannaCry was enabled by the very security apparatus that seeks to prevent such attacks, the National Security Agency. The NSA hoards software exploits and vulnerabilities that it discovers, and develops tools to weaponize those exploits against its enemies. But in order to keep those exploits available, it is in the NSA’s best interest not to disclose them to software manufacturers like Microsoft, whose Windows operating system is at the center of the recent malware attack. In this case, the incentives of the NSA are aligned with those of ransomware coders. This is not a great setup.

Technically, the intelligence community has a process for deciding what to disclose to the public. It is known as the Vulnerabilities Equities Process, though the process’s inner workings are still kept secret. But it’s a Catch-22. By informing companies of holes in their products’ security, the government closes off vectors it could use to attack enemies; by withholding that information, it leaves American systems open to attack. WannaCry is powered by NSA exploits acquired by a hacking group called the Shadow Brokers (cool name) and made public online. The internet — the great democratizer — turned a tool of the American government into a weapon that anyone, worldwide, could use.

With any luck, WannaCry will be a wake-up call for the intelligence community that the internet is, in most important ways, a borderless network, and that the tools it creates to attack targets can be used for less noble purposes than protecting national security. There is no broadly applicable hacking tool that cannot be repurposed for extortion or malevolence, and absent broad international agreements and powerful consumer protections, stockpiling exploits is a dangerous practice.

In a way, WannaCry’s spread hearkens back to the debate over weak and strong encryption two decades ago — now known as the Crypto Wars. Though public-key encryption systems like PGP were available to normal consumers, strong encryption technology was classified as “military-grade,” and subject to outdated regulations concerning its export to other countries. The U.S. government wanted its people to have the strongest encryption possible, and to not share it with anyone else. The problem was that this inadvertently caused computer users to default to weak encryption. What use was strong encryption if you could only use it with certain recipients? By trying to limit the encryption capabilities of people outside the country, the export controls also limited the security of computer users within the United States. American regulations were no match for the internet’s decentralized distribution network, and realizing the regulations were quickly becoming unenforceable, the government gradually relaxed those restrictions throughout the final decade of the 20th century.

It all comes back to the net part of internet. If one part of the chain of nodes and cables is weak, then the whole network is weak. An unpatched computer in Europe can infect another computer in America, propagating automatically, and without human decision-making, across borders. The most nefarious actors will not discriminate in this regard, and the NSA’s tools, while complex, can already be wielded by amateurs. To wit, security researchers combing through the malware have found elements in the code that imply shoddy construction and poorly considered execution. WannaCry wasn’t the work of methodical experts; one could argue that it was the work of anarchists.

Hoarding security exploits is a game with no winners. The American intelligence community is still operating under the assumption that it can limit what gets distributed across the internet and around the globe, and that hoarding vulnerabilities isn’t an act of mutually assured destruction. It should disabuse itself of such nonsense.