
What the Apple versus FBI Debacle Taught Us

Legal wrangling in Congress and the courts over data encryption is opening the door for new approaches to managing our personal data

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American


The December 2015 terrorist attack in San Bernardino, Calif., continues to be felt around the world in many ways—not only as a brutal act of terrorism but also for bringing the ongoing struggle between digital privacy and security to an inflection point.

Although the FBI dropped its demand that Apple unlock the iPhone used by one of the attackers after the Bureau managed to get into the phone on its own, the faceoff between law enforcement and one of the world’s largest tech companies remains largely unresolved. A number of ongoing court cases and law enforcement investigations involving locked iPhones—as well as a plethora of encrypted messaging platforms now available—serve as a proxy for the larger conflict pitting society’s demands for protection from crime and terrorism against its need to retain some measure of personal privacy in our digital lives.

The gloves are off. The U.S. government is now pushing new legislation in response to the Apple–FBI impasse that would give law enforcement greater access to information—regardless of whether it is encrypted—in the name of public safety. Congress seeks to ensure that no company is exempt from complying with a court order requiring it to help law enforcement, even if that means decrypting customer information. The U.K.’s Investigatory Powers Bill—nicknamed the Snooper’s Charter—has similar goals.


Such efforts have inspired some companies to double down on data encryption. Facebook-owned WhatsApp recently announced it is providing end-to-end, military-strength message encryption for its 1 billion monthly active users, a move that pushes the two sides even further apart. Even Google, a company that lets advertisers run amok throughout the online services it offers its customers, has begun to emphasize automatic data encryption on smartphones running its Android mobile operating system.
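To make the stakes concrete, here is a minimal sketch of why end-to-end encryption locks out the service provider. It uses the open-source PyNaCl library purely as an illustrative stand-in (WhatsApp actually builds on the Signal protocol); the point is that keys live only on users’ devices, so the company in the middle relays nothing but ciphertext.

# Illustrative sketch only, assuming the PyNaCl library is installed;
# not WhatsApp's actual Signal-protocol implementation.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message to Bob with her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The service provider only ever sees ciphertext; without a private key it cannot read it.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"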

The wider implication of FBI v. Apple is the need to separate custodial responsibilities among those who handle personal data. To see how this works in the physical world, where property rights and their legal frameworks are well established, think of your phone as a bag bought from the manufacturer. If you wish to borrow my bag or look at its contents, you do so on my terms. Break those terms and I can take my bag back or sue you. This is possible because custodial (property) rights can be treated the same as physical property rights in law: they are ‘super rights’ whose holder can assign access and other rights over the belongings to others.

Firms such as Apple, which ensure that no one, not even their own staff, can get into your data unless you allow it, are keen to hand custodial rights over data to their customers and then raise their hands when the Feds turn up, saying ‘get a court order’ or ‘ask the user; the data is theirs’. For Apple, security is a service provided without rights of access—the lock maker’s mantra. We still don’t know whether this argument would stand up in court, because the FBI dropped the San Bernardino case, and there is no case law that conclusively says whether data can be treated as property.

Google, Facebook and most other big tech companies don’t typically give customers exclusive custodial rights over their data. As threats to their security and reputations mount, however, these companies are starting to change their tune. In the case of WhatsApp, Facebook still retains customer metadata that can be used for ad targeting but has essentially locked itself out of customer messages. As with Apple, it’s unclear whether this approach shields Facebook from having to unlock encrypted WhatsApp messages if the government demands it.

All of this uncertainty opens the door to new types of services that allow people to store personal information apart from the Apples and Googles of the world. The HAT Foundation’s “hub of all things” project, for example, offers the equivalent of a personal Internet data container—sometimes referred to as a microservice container—and lets people set the terms for how businesses, Web sites and other online services contribute to and access this data.
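As a rough illustration of the model (a hypothetical sketch, not the HAT project’s actual design or API), a personal data container boils down to a store the owner controls plus an access policy the owner sets per requester:

# Hypothetical sketch of a user-controlled personal data container.
# Names and structure are illustrative only.
class PersonalDataContainer:
    def __init__(self, owner):
        self.owner = owner
        self._data = {}     # e.g. {"fitness.steps": 9120, "home.thermostat": 21.5}
        self._grants = {}   # requester -> set of keys that requester may read

    def put(self, key, value):
        # The owner (or a device acting for the owner) writes data here.
        self._data[key] = value

    def grant(self, requester, keys):
        # The owner, not the platform, decides who sees what.
        self._grants.setdefault(requester, set()).update(keys)

    def revoke(self, requester):
        self._grants.pop(requester, None)

    def read(self, requester, key):
        if key in self._grants.get(requester, set()):
            return self._data.get(key)
        raise PermissionError(f"{requester} has no grant for {key}")

# Usage: an outside service can read only what the owner explicitly shares.
hat = PersonalDataContainer(owner="alice")
hat.put("fitness.steps", 9120)
hat.put("home.thermostat", 21.5)
hat.grant("ad-service.example", {"fitness.steps"})
print(hat.read("ad-service.example", "fitness.steps"))   # 9120
# hat.read("ad-service.example", "home.thermostat")      # raises PermissionError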

As we move toward an Internet of Things (IoT) world, where connected devices with embedded sensors communicate directly with one another, the market is likely to be pluralistic when it comes to ‘custody’ of personal data. The likes of Google and Facebook will always want to retain it, but many small companies making IoT devices might prefer not to take the privacy or reputational risk and instead opt for user-held containers that grant access rights back to them. Legal frameworks will have to deal with increasingly complex questions of possession, use and disposition of data, both at rest and in use.

The container model may be unproven, but it suggests a way forward in which Internet companies and the customers they rely on for data can better align their objectives for everyone’s benefit.

Instead of painting a future in which we can have privacy or security only at the expense of the other, tech companies and law enforcement should focus on new models of data protection and access that build, rather than erode, trust in the system.