
Privacy, Europe's Newest Luxury Export

The E.U.’s new rules aim to make companies treat your privacy as a right rather than a good to be bargained with

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American


Dear Valued Customer—notice anything unusual about your inbox lately? In recent weeks, people around the globe have been receiving a flood of similarly bland and placid assurances with an accompanying nudge. “We value your trust in us, so we’ve made some changes to our privacy policy—please accept to continue using our service.”

Some of these announcements even sheepishly admit that their timing isn’t accidental: on May 25, the European Union’s General Data Protection Regulation (GDPR) went into effect, and notifying consumers of new data collection practices is the finishing touch as companies around the globe attempt to comply with the new law.

While the GDPR isn’t focused on privacy policies per se, it is a bold and admirable attempt by the European Union to provide individuals with meaningful control over their information.




In the digital age, companies govern privacy largely by providing users with notices about how their data will be used. The only choice that users have if they disagree with those practices is to not use the product, or to use only a limited version of it. If that sounds like a one-sided bargain, that’s because it is. As the overwhelming inundation of privacy policy updates may illustrate, reading every applicable policy isn’t a feasible method for most people to understand, much less control, how their information is used. Relying on boilerplate notices no rational person takes the time to read isn’t privacy regulation, it’s data duck and cover.

The GDPR attempts to rectify the power differential between individuals and the entities that collect their data by enacting baseline privacy rights for individuals, along with transparency and accountability requirements for companies. Entities that don’t fall into line with the E.U.’s new regulation can expect big fines. It is a technology- and sector-neutral, comprehensive approach to privacy regulation, and applies to any entity (not just companies) that processes the personal data of individuals in the E.U., whether they are citizens, residents or just tourists.

The law applies regardless of whether the data-collecting entity has any other ties to the E.U. or where the processing takes place. If the entity is based in the E.U., the GDPR may directly protect data subjects all over the world, provided their data is processed in the context of the activities of the E.U.-based entity.

The new law creates or reaffirms a variety of rights for individuals to control their information, how it is processed, and why. These include:

  • Mandatory notification for individuals in the case of significant data breaches

  • The right to access your data, correct it and delete it

  • The right to receive your data and take it to another controller (data portability)

  • The right to withdraw consent at any point

If an entity—such as a bank that reaches out to customers via an app—relies on customer consent to collect data, that consent must be freely given, specific to the purpose of collection, informed and unambiguous. That is a much higher standard than most clickwrap agreements (“Click Here to Accept the Terms of Service”) would meet.

Other important individual rights concern automated decision-making, such as when a company or government entity uses algorithms and other aspects of AI to determine government benefits, creditworthiness or employment eligibility. The GDPR gives an individual the right not to be subject to automated processing and profiling that produces “a legal effect concerning him or her” unless a few restrictive conditions are met. And the entity using the information must explain the logic behind that automated processing.

The GDPR also creates more direct requirements for those collecting and processing a person’s information. A company, for example, building a product that collects data must consider possible privacy risks and strategies to mitigate them, both when that product is created and when the data are processed. Certain companies may be required to appoint a data protection officer and conduct privacy impact assessments on their use of data, as well as maintain appropriate security measures and records of how personal information is processed.

Regulators and companies worldwide will be keeping a close eye on the influence and efficacy of the GDPR. Some global companies, like Microsoft, already have announced their decision to extend the new GDPR protections to all users, regardless of jurisdiction. Others may choose to follow the E.U.’s lead, rather than deal with the hassle of fragmented data use guidelines across the globe or risk the bad publicity of keeping a privacy double standard: strong privacy protections for E.U. users and substandard protections for everyone else.

Some of the new GDPR principles already have been enacted in at least 10 countries outside of Europe, according to Graham Greenleaf, a professor of law and information systems at Australia’s University of New South Wales.

What does all of this mean for the U.S.? For starters, the E.U.’s new data privacy regulation might expose the flaws of the U.S. privacy status quo, which forces users to agree to a site’s or app’s terms but offers no meaningful control or recourse when data breaches leave them vulnerable. In the U.S., privacy is protected by a splintered array of sector-specific laws that treat it as a good or a privilege (placing the onus on individuals to decide its value and conduct their affairs accordingly) rather than a right deserving of baseline guarantees.

In contrast, the GDPR treats privacy as a right that must be respected and addressed. The new law’s codification of privacy by design could be worthy of emulation, given that the design of data-collection technologies is integral to how users think and act. Product designers and engineers often treat privacy as an afterthought in the rush to market their software.

If individuals have a right to data portability, allowing them to remove their information from one site or app and bring it to another platform, that could set the stage for a more competitive marketplace in lieu of the oligopolistic mega-platforms that users have too many reasons not to leave. The GDPR is an opportunity for U.S. privacy law to take a careful look in the mirror—and if it doesn’t like what it sees, maybe now’s the time to do something about it.

About Lindsey Barrett and Gabriela Zanfir-Fortuna

Lindsey Barrett is the Georgetown Policy Fellow at the Future of Privacy Forum. She received her BA from Duke and her JD from Georgetown. Her work has previously been published in the New York Review of Law and Social Change, the Georgetown Law Journal and the Georgetown Law Technology Review, of which she was the managing editor and co-founder. Gabriela Zanfir-Fortuna, PhD, is policy counsel at the Future of Privacy Forum and an affiliated researcher at LSTS, Vrije Universiteit Brussels. She worked in Brussels for the European Data Protection Supervisor and the Article 29 Working Party, after obtaining her PhD from the University of Craiova with a thesis on the rights of data subjects.
