Some of these announcements even sheepishly admit that their timing isn’t accidental: on May 25, the European General Data Protection Regulation (GDPR) went into effect, and notice to consumers of new data collection practices is the finishing touch as companies around the globe attempt to comply with the new law.
While the GDPR isn’t focused on privacy policies per se, it is a bold and admirable attempt by the European Union to provide individuals with meaningful control over their information.
The GDPR attempts to rectify the power differential between individuals and the entities that collect their data by enacting baseline privacy rights for individuals, along with transparency and accountability requirements for companies. Entities that don’t fall into line with the E.U.’s new regulation can expect big fines. It is a technology- and sector-neutral, comprehensive approach to privacy regulation, and applies to any entity (not just companies) that processes the personal data of individuals in the E.U., whether they are citizens, residents or just tourists.
The law applies regardless of whether the data-collecting entity has any other ties to the E.U. or where the processing takes place. If the entity is based in the E.U., the GDPR may directly protect data subjects all over the world, provided their data is processed in the context of the activities of the E.U.-based entity.
The new law creates or reaffirms a variety of rights for individuals to control their information, how it is processed, and why. These include:
- Mandatory notification for individuals in the case of significant data breaches
- The right to access your data, correct it and delete it
- The right to receive your data and take it to another controller (data portability)
- The right to withdraw consent at any point
If an entity—such as a bank that reaches out to customers via an app—relies on customer consent to collect data, that consent must be freely given, specific to the purpose of collection, informed and unambiguous. That is a much higher standard than most clickwrap agreements (“Click Here to Accept the Terms of Service”) would meet.
Other important individual rights concern automated decision-making, such as when a company or government entity uses algorithms and other aspects of AI to determine government benefits, creditworthiness or employment eligibility. The GDPR gives an individual the right not to be subject to a decision based solely on automated processing, including profiling, that produces “a legal effect concerning him or her” unless a few restrictive conditions are met. And the entity using the information must explain the logic behind that automated processing.
The GDPR also creates more direct requirements for those collecting and processing a person’s information. For example, a company building a product that collects data must consider possible privacy risks and strategies to mitigate them, both when that product is created and when the data are processed. Certain companies may be required to appoint a data protection officer and conduct privacy impact assessments on their use of data, as well as maintain appropriate security measures and records of how personal information is processed.
Regulators and companies worldwide will be keeping a close eye on the influence and efficacy of the GDPR. Some global companies, like Microsoft, already have announced their decision to extend the new GDPR protections to all users, regardless of jurisdiction. Others may choose to follow the E.U.’s lead, rather than deal with the hassle of fragmented data use guidelines across the globe or risk the bad publicity of keeping a privacy double standard: strong privacy protections for E.U. users and substandard protections for everyone else.
Some of the new GDPR principles already have been enacted in at least 10 countries outside of Europe, according to Graham Greenleaf, a professor of law and information systems at Australia’s University of New South Wales.
What does all of this mean for the U.S.? For starters, the E.U.’s new data privacy regulation might shine a light on the flaws of the U.S. privacy status quo, which forces users to agree to a site or app’s terms but offers them no meaningful control or recourse when data breaches leave them vulnerable. In the U.S., privacy is protected by a splintered array of sector-specific laws that treat it as a good or a privilege (placing the onus on individuals to decide its value and conduct their affairs accordingly) rather than a right deserving of baseline guarantees.
In contrast, the GDPR treats privacy as a right that must be respected and addressed. The new law’s codification of privacy by design could be worthy of emulation, given that the design of data-collection technologies is integral to how users think and act. Product designers and engineers often treat privacy as an afterthought in the rush to market their software.
If individuals have a right to data portability, allowing them to remove their information from one site or app and bring it to another platform, that could set the stage for a more competitive marketplace in place of the oligopolistic mega-platforms that users have too many reasons not to leave. The GDPR is an opportunity for U.S. privacy law to take a careful look in the mirror—and if it doesn’t like what it sees, maybe now’s the time to do something about it.