In 2017, I wrote:a "So here we are, 70 years into the computer age and after three ACM Turing Awards in the area of cryptography (but none in cybersecurity), and we still do not seem to know how to build secure information systems." What would I write today? Clearly, I would write: "75 years," but I would not change a word in the rest of the sentence. In fact, one could argue that the cybersecurity threat has increased, as critical infrastructure is now vulnerable to cyberattacks. Indeed, in May 2021 the U.S. oil pipeline system Colonial Pipeline came under a ransomware attack that forced it to halt all pipeline operations to contain the attack.
The slow progress in cybersecurity is leading many to conclude that the problem is not just a lack of technical solutions but reflects a market failure, one that disincentivizes those who could fix serious security vulnerabilities from doing so. As I arguedb in 2020, the computing fields tend to focus on efficiency at the expense of resilience. Security usually comes at a cost in terms of performance, a cost that market players seem reluctant to pay.
To discuss the market-failure issue and how to address it, the Computing Community Consortium organized a visioning workshop in August this year on Mechanism Design for Improving Hardware Security.c The opening talk was given by Paul Rosenzweig, an attorney who specializes in national security law. He argued that technological development is founded, in the end, on human behavior. So, the key to good cybersecurity is to incentivize humans. Thus, the answer lies in the economics of cybersecurity, which is, mostly, a private domain with many externalities, where prices do not capture all costs.
One such glaring externality is the lack of accountability in the computing marketplace. Whenever we use a computing system, we must consent to a click-through license that almost always includes language such as "To the extent not prohibited by applicable law, in no event shall XXX be liable for personal injury or any incidental, special, indirect, or consequential damages whatsoever." Since computing is rarely covered by "applicable law," it follows that computing is not covered by the standard rules of strict liability, which do not depend on actual negligence or intent to harm. As the philosopher Helen Nissenbaum pointed out in a 1996 article,d while computing vendors are responsible for the reliability and safety of their products, the lack of liability results in a lack of accountability. She warned us more than 25 years ago about eroding accountability in computerized societies. The development of the "move-fast-and-break-things" culture in this century shows that her warning was on the mark.
A typical response from the tech industry to such complaints is to wave the consent flag. "You clicked through the license, so you accepted the terms," they say. "So, what are you complaining about?" But this argument is a red herring! A contract between parties of such disproportionate bargaining power that the weaker party could not have negotiated for variations in its terms is known, in legal terms, as an "adhesion contract." Courts have a long history of striking terms from such contracts, or voiding them entirely, when they determine the terms to be especially egregious to standards of fair play. In my opinion, such prevailing waivers of liability, which may have been appropriate when computing was young and all computing systems could have been viewed as experimental, ought to be considered today as especially egregious to standards of fair play.
It is not clear to me why the legal system has yet to address this glaring externality in the computing marketplace. Since it has not, it is time to address it using laws and regulations. There is a long tradition of the law imposing strict liability on vendors. As Nissenbaum pointed out, almost 4,000 years ago, the Hammurabi Code stated: "If a builder has built a house for a man and has not made his work sound, and the house that he has built has fallen down and so caused the death of the householder, that builder shall be put to death."
The tech industry has traditionally been hostile to regulation. "Regulation stifles innovation," is the refrain. But innovation is not an end; it is a means. The end was declared by Hammurabi to be "to further the well-being of mankind." If we want to address the cyber-insecurity issue, we should start by welcoming liability into computing.
This article originally appeared in Communications of the ACM.