
Electronic Contracts and the Illusion of Consent

The ubiquitous "click-to-agree" mechanism presses people to act like machines



Q: What do you do when you see a little button on a webpage or app screen that says I agree?

A: Click the button.

The familiar and incredibly simple click-to-agree mechanism is ubiquitous. We encounter it throughout our digital lives. It is nothing less than the “legal backbone” of the internet, app stores, e-commerce and so much more.




Yet electronic contracting and the illusion of consent-by-clicking are a sham.

I was excited to see the editorial board of the New York Times publish “How Silicon Valley Puts the ‘Con’ in Consent” on February 2, 2019. They dispelled the illusion and asked the obvious question: “If no one reads the terms and conditions, how can they continue to be the legal backbone of the internet?” If only they’d provided answers.

I’ll give some below, but first, let me explain where the Times got sidetracked.

In diagnosing the “con,” the editors emphasize privacy. It’s all about the data. Contracts and the illusion of consent by clicking enable surveillance and complex, hidden and varied data flows. The editors argue for “strong privacy protections,” which makes good sense.

Privacy is a necessary thing to talk about, but it’s just a copse of trees. The forest is humanity. As Evan Selinger and I argue in our book Re-Engineering Humanity, especially the chapter titled “Engineering Humans with Contracts,” the more fundamental concern is how the click-to-contract human-computer interface nudges humans to behave automatically, without thinking, like simple machines. Much more is surrendered than hidden data flows. The click-to-contract script is dehumanizing.

People don’t read terms of service, privacy policies or other electronic boilerplate. People rarely stop and think about the parties with whom they’re forming legally binding relationships online.

This is not by accident; it is by design.

For the sake of efficiency and convenience and to comply with the minimal requirements imposed by law, designers arrange the digital contracting environment to create a practically seamless, transaction-cost–minimized user experience. Rather than requiring people who intend to use online services to read lengthy pages filled with boilerplate legal jargon—jargon they can’t reasonably be expected to understand and won’t be able to negotiate with anyway—a simple click of the mouse, with conspicuous notice of the mere existence of terms, is enough to manifest consent to enter legally binding relationships.

Our online contracting regime is a compelling example of what Evan Selinger and I call techno-social engineering: legal rules coupled with a specific technological environment can lead us to behave like simple stimulus-response machines: perfectly rational, but also perfectly predictable and ultimately programmable. The environment disciplines us to go on autopilot and, arguably, helps create or reinforce dispositions that will affect us in other walks of life that involve similar technological environments.

The frictionless click-to-contract button and bloated terms are features of a human-computer interface that creeps from websites to apps to smart TVs and eventually to every supposedly smart device we’ll soon see when the internet of things finally arrives. (Aren’t you excited?)

The parties, legal relationships, technologies, services provided, data generated and collected, and implications vary dramatically across these contexts. Nonetheless, our behavior remains more or less the same: perfectly predictable, seemingly rational and hyperefficient. Check the box, click “I agree.”

Individually, just clicking is easily defensible; each click is cost-benefit justified on its own. And given the multitude of such encounters, clicking to consent becomes a rational strategy for individuals, a script to perform automatically.

The current scale and scope of private ordering by written contract is unprecedented. Evan and I haven’t quantified the number of written contracts the average person enters into during her lifetime, but we suspect the number has steadily, if not exponentially, increased over the past half century; the rate of meaningful participation in negotiating terms has steadily decreased; and the number of written contracts concerning mundane affairs has increased, if not skyrocketed. By mundane, I simply mean ordinary, everyday affairs for which a written contract would be cost-prohibitive, inefficient and downright silly in the absence of digital boilerplate.

How many written contracts have you entered in your life? In the past, the answer would have been orders of magnitude smaller than it is today. Yet future readers may find the question odd, because the idea of distinct, identifiable contracts may be at odds with their experience of completely seamless contractual governance. This possibility highlights the basic issue.

Freedom of contract requires the inverse: freedom from contract. When contract becomes automatic and ubiquitous, both disappear. There is no freedom. Genuine autonomy is lost.

Brett Frischmann is the Charles Widger Endowed University Professor in Law, Business and Economics, Villanova University. His latest book is Re-Engineering Humanity (Cambridge University Press 2018). His novel, Shephard's Drone, will be out on February 3, 2019.
