Is Privacy Dead? Technological Approaches to the Technological Threat

In this episode Carnegie Mellon University computer scientist Latanya Sweeney talks about the changes in privacy due to data collection and approaches to protect privacy in the future, with Scientific American contributor Chip Walter. Plus we'll test your knowledge of some recent science in the news. Websites mentioned in this episode include privacy.cs.cmu.edu; www.chipwalter.com

Welcome to Science Talk, the weekly podcast of Scientific American for the seven days starting August 1st. I am Steve Mirsky. This week on the podcast:

Sweeney: We began, sort of, our data detective mission: "Could you go out on the Internet? How much information can you find about people?" And one of the easiest targets is actually college-age people.

Steve: That's Latanya Sweeney from Carnegie Mellon University talking about computer privacy and privacy in general. She is our guest this week. Plus, we'll test your knowledge about some recent science in the news. Latanya Sweeney is an associate professor of computer science, technology and policy at Carnegie Mellon University in Pittsburgh. She is the founder and director of that institution's Laboratory for International Data Privacy, and she is considered one of the nation's leaders in creative ways to protect privacy in an increasingly porous world. She was recently interviewed by Chip Walter on a program called The World Revealed, broadcast on WRCT, the radio station of Carnegie Mellon University. The show features long-form interviews with scientists and others talking about how the world works. Chip was a guest on the SciAm podcast on January 3rd, when he talked about his most recent book, Thumbs, Toes and Tears, a look at how the evolution of six physical traits unique to humans shaped who we are. Thumbs, Toes, and Tears is a Scientific American Book of the Month Club selection. Chip also contributes to Scientific American Mind and Scientific American magazines. He profiled Latanya Sweeney in the July issue of Scientific American. Here is an edited version of the conversation that Chip and Dr. Sweeney had on WRCT.


Walter: You know, there is a famous, or infamous, quip that Scott McNealy, the co-founder and CEO of Sun Microsystems, made a few years ago. He said, "Privacy is dead, get over it." You know, is privacy dead?

Sweeney: Well, let's say it is critically wounded, but it still has life. More importantly, we can't afford for it to actually die; it is essential for us to be able to maintain privacy. We may not be able to continue it the way we knew it, that is, the world 20 years from now is probably not going to have individual privacy the way we had it as children, or 20 years earlier, but there are some fundamental things about privacy and fairness that have to be restored and have to be protected.

Walter: So what has happened, you know, what's really changed that's made it much more difficult to hang onto our privacy?

Sweeney: Well, there are three things, I think, that really came into play. One is technology. In some sense, one of the best protections we had on privacy 15 to 20 years ago wasn't actually the laws and practices, but the absence of technology as we know it today. Now data is so ubiquitous, and getting a copy of it and transferring it around the world is just a matter of seconds.

Walter: Well, one thing you found, right, is that we kind of leak information and don't realize we've leaked it.

Sweeney: That's true.

Walter: You know, it kind of proliferates and travels.

Sweeney: That's true, and that sort of gets to the second cornerstone of why things have really changed, and it has to do with the fact that there are these data collections everywhere. It's just cheaper to collect data; it doesn't really cost anything anymore. The cost of these large storage spaces is approaching zero, so it is very easy to capture the information and then just keep it. One of the ways that's played out in our policies, though, has also been that whenever we faced national problems, our answer has been to turn to data collections. On a federal and government level we've actually pushed the ball faster against privacy by reacting to public problems by saying, "We don't know the answer, so let's start a data collection." I'll give you some examples. You may remember the "Deadbeat Dads" problem, as it was called, when parents were not being financially responsible for their children. When that got to the congressional level, Congress didn't really know how to solve it, so instead it started a new database called the Database of New Hires, and all of us are in it if we work, whether we have children or not; the information is automatically taken from payroll information and given to the database.

Walter: Right, and so you're not saying that the intent was to invade our privacy, but the result is that it makes it easier for our privacy to be invaded.

Sweeney: Right. And then public sentiment, I think, has also pushed the ball, and that is that whenever the answer to an immediate need of ours was basically more data or some privacy-invasive technology, we tended to elect to have it. So coming out of the '80s and '90s was this idea that putting video in high-crime areas, or putting video in lots of places, would make us feel safer, and so we tended to opt for it. Closed-gate communities opted for more security around those communities, and people chose to live in that environment. So by the time 9/11 came and, all of a sudden, the country goes into a crisis, you say, "Oh my God! Of course we want to be safe." The idea that we would give up our privacy in those moments, whether we perceived it as short term or, to some extent, long term, is the way that Americans had been making decisions. But now the pendulum swings the other way, and we are left with the consequences of these privacy decisions and the question of how we can actually keep the democracy alive in the face of so much privacy invasion.

Walter: It seems like what I am hearing is we've elected, kind of naively, to proliferate all this information about ourselves, or to gather information, not necessarily in a "Big Brother" sort of way, but nevertheless it's there, and therefore people who are savvy enough to find it, people we maybe don't want to have this information, can get it. So you have developed programs that will scrub this information out of databases at hospitals, for example, de-identification software; you have something called "k-anonymity"; and you had one that got a lot of attention called "Identity Angel." Maybe you could start with that one and then talk a little bit about some of the things you are doing at the lab.

Sweeney: Sure. What we do in the lab is try to look at technologies that are directly aimed at privacy problems that society is having, and then try to look at the technology and the problem in a new way and develop new technologies or technology add-ons. This is sort of a...

Walter: …kind of like a data detective agency?

Sweeney: (laughs) Yeah! Like a data detective agency. Yeah!

Walter: Right! You are a gumshoe.

Sweeney: Right. And to introduce solutions so that society can enjoy the new technology and its benefits and at the same time have privacy. So, one of them that we became very interested in was Identity Angel. We had been looking at some statistics coming out of the Federal Trade Commission that showed an increase in credit card theft, and the statistics tended to show, at that time, that people who were basically college-age were among the most frequent victims. And being at a university, we took a lot of note of that: Why is that happening? How could that be? And we began, sort of, our data detective mission and said, "Well, could you go out on the Internet? How much information can you find about people?" And one of the easiest targets is actually college-age people, because they are used to having a lot of information about them; they've exchanged a lot of information about themselves online. They have Web pages or Facebook pages and what have you.

Walter: Right, right.

Sweeney: And so as a result there is a lot of information about them, and so the question was, "How easy would it be to impersonate them in credit card fraud?" And when you look at what it takes to get a credit card in someone's name, it is scary. It doesn't take very many pieces of information, and so the question was, "Could we write a program that could go mine the Web looking for information, so that if it found the right combination of information on any individual, enough to get a credit card in the person's name, it could then figure out how to alert the person and let them know that they are vulnerable and which information they might want to change, move or modify?"

Walter: And so what did you find? I mean, you found that it was pretty easy, too...

Sweeney: Yeah! We were expecting to find a few, but we found tens of thousands of instances where the information available about people was sufficient to get credit cards in their name.

Walter: Right. So you kind of played the bad guy, in a sense; you went out and gathered the information about these people relatively easily, tens of thousands of people, and you didn't steal their identities, but you could have, and didn't want to...

Sweeney: Right, and it's funny that you said it that way, because the way the program works is, once it finds enough information about you that someone could get a credit card in your name, it figures out whether or not it can e-mail you and then it sends you an e-mail message. So the funny thing is that the first couple of hundred people who actually got messages from us turned around threatening to sue Carnegie Mellon, saying, "Oh my god! You are stealing our identity," and those people really did not understand at the time, you know, what was going on.

Walter: That you were being the good guy.
Sweeney: Yeah, that we were being the good guy.
Walter: So, "Hey, we are just trying to help you out here?"
Sweeney: Yeah!
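For readers who want a concrete picture of the kind of check the Identity Angel crawler performs, here is a minimal Python sketch under stated assumptions: it is not the Data Privacy Lab's code, and the field names, the rules for what counts as a "sufficient combination," and the alert text are illustrative guesses at the logic Sweeney describes (mine fragments from the Web, test whether they add up to enough to get a credit card, then try to alert the person by e-mail).

```python
# Rough sketch (not the Data Privacy Lab's actual code) of the decision Identity
# Angel is described as making: do the publicly found fragments about a person
# add up to enough to apply for a credit card, and if so, can they be alerted?
# The field names and the "sufficient combination" rules are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class PublicProfile:
    """Pieces of information a crawler might have scraped from public pages."""
    name: str | None = None
    date_of_birth: str | None = None
    ssn: str | None = None
    address: str | None = None
    email: str | None = None


# Hypothetical rules: any of these combinations is treated as enough
# to impersonate someone on a credit application.
SUFFICIENT_COMBINATIONS = [
    {"name", "ssn", "date_of_birth"},
    {"name", "ssn", "address"},
]


def exposed_fields(profile: PublicProfile) -> set[str]:
    """Names of the fields that were actually found online."""
    return {k for k, v in vars(profile).items() if v}


def is_vulnerable(profile: PublicProfile) -> bool:
    """True if any sufficient combination of fields is fully exposed."""
    found = exposed_fields(profile)
    return any(combo <= found for combo in SUFFICIENT_COMBINATIONS)


def draft_alert(profile: PublicProfile) -> str | None:
    """Compose the e-mail the person would receive, if they can be reached."""
    if not (profile.email and is_vulnerable(profile)):
        return None
    risky = exposed_fields(profile) - {"email"}
    return (
        f"To {profile.email}: enough information about you "
        f"({', '.join(sorted(risky))}) is publicly visible to request a credit "
        "card in your name. Consider removing or changing some of it."
    )


if __name__ == "__main__":
    # Example: a student whose online resume lists far too much (made-up data).
    student = PublicProfile(name="A. Student", date_of_birth="1986-04-02",
                            ssn="000-00-0000", email="a.student@example.edu")
    print(draft_alert(student))
```

The Web crawling and the matching of scattered fragments to the right person, which this sketch skips entirely, is presumably where most of the real work lies; the final "enough to get a credit card" test and the alert are the simple part.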

Walter: Are we just being a little naive and assuming that the world is the way it used to be? I mean, I've got two teenage daughters and they've got Facebook pages, and they go, "Oh no! Dad, you don't have to worry about it," but we really should.

Sweeney: Yeah! You definitely should, because in some sense you have to think about how much of your life gets documented, minute by minute. For most people who are adults now, the times you skipped school, the times you didn't eat your lunch in the cafeteria, the time you stuck your finger in your nose when you thought no one was looking, all those incidents, and many more serious ones, did not get recorded. And suppose you, even as a young adult, had found yourself in financial trouble and ended up declaring bankruptcy, and then after your ten-year wait period you got credit, or the opportunity for credit, again. For older adults, that is the life that we know. But today you have to think in terms of, "Oh my god!" In some sense, there is a document of almost every minute of a child's life, from the time they are born to the time they die, and there is no forgetting and there is no forgiving.

Walter: So even though your credit record may be cleared 10 years after a bankruptcy, it would still be possible that somewhere it's recorded, it's available...

Sweeney: That's right, and somebody looking to give you big credit isn't necessarily obliged to forget that ten-year period. They can go back now in a way that they couldn't before.

Walter: Right, whereas in the old days, it would just be sitting in some vault on a piece of paper that would have been much more difficult to get your hands on.

Sweeney: Yeah! I think the best analogy I have for describing the shift that's happening is that old saying you used to hear, "Go west, young man." "Go west, young man" was a saying that, in effect, said if you messed up on the East Coast, you could go to the West Coast and start over again. Well, by the time people who are older adults now were growing up, that was no longer quite true, but there were still remnants of this notion that you pay your penance and then you go on in life. But nowadays there is no West Coast for the generation coming after us. Yeah!

Walter: There is no escape. Yeah, there is no escape; anywhere you go, your information can follow you.

Sweeney: Yeah!

Walter: When you were talking about video surveillance, that sort of thing, it reminded me of another project that you worked on, which was really interesting, and basically this came out of work, I think, you were doing for the federal government. It was about video surveillance, but the idea was to hide the identity of the people who were being watched and recorded, so tell us a little bit about that.

Sweeney: Well, I think there are many examples of this, but let me give you one quick example to think about, what I would call the law enforcement "catch-22" problem. Say there is a group of people in a room, like we are here, and let's say some incident happened out in the hallway and Carnegie Mellon had a videotape angled on us. When the police come, Carnegie Mellon knows that the videotape couldn't possibly show the incident in the hallway, and so Carnegie Mellon says, "Well, I'm not going to give you the videotape of the people who are in the room, because that's irrelevant," and law enforcement says, "Oh no! That's relevant, because what if somebody was giving a signal to somebody out in the hall? What if somebody was looking out into the hall at the time the incident happened? We'd have a witness."

Walter: Right!

Sweeney: So now law enforcement is really in a catch-22. On the one hand, if they could get the video, they'd know whether it was useful and could reap benefits from it, but they can't get the video without a privacy violation that otherwise wouldn't happen. So how do you solve this problem more generally, as video cameras are being placed in lots of public spaces? At the same time, you don't want people being tracked by matching up their identities to their driver's licenses; you don't want the situation where law enforcement, if face recognition worked perfectly, could literally track people throughout public spaces all the time.

Walter: Right!

Sweeney: So how do you restore some of this need for a search warrant, so that only if something goes wrong do they need a search warrant? Our solution was what we call "face de-identification." Given a video clip, we take the faces out of the video clip, detecting them in real time. We sort of peel off each face, produce an average face and then morph that face back into the video, so when you look at the video, you see people having the same expressions they had before, doing the same activities they were doing, except the faces that used to be on their shoulders aren't their faces.

Walter: Right, right. So you are actually seeing the face of someone that doesn't exist, anywhere.

Sweeney: That's right. And then we can prove that even if you had, you know, a complete driver's license database and everyone was in it, you still wouldn't be able to confidently identify the person.

Walter: Right! Face recognition by software would not be possible, but if there was a crime, or there was some strong reason why you would need to see the real face that was there, it would be possible to see it.

Sweeney: That's right.
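As an illustration of the pipeline Sweeney outlines (detect faces in each frame, build an average face, put the average back where the real faces were), here is a minimal Python/OpenCV sketch. It is a simplified stand-in, not the lab's actual de-identification method; in particular it pastes a single running average face over every detection rather than morphing faces as Sweeney describes, and the file names in the example are hypothetical.

```python
# Minimal sketch of the pipeline described above: detect faces in each frame,
# keep a running "average face," and paste that average back over every
# detection so activities stay visible while individual faces do not.
# Illustration only; the lab's actual methods are considerably more sophisticated.

import cv2
import numpy as np

FACE_SIZE = (96, 96)  # common size used to accumulate the average face


def deidentify(video_in: str, video_out: str) -> None:
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_in)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(video_out,
                             cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

    avg_face = np.zeros((FACE_SIZE[1], FACE_SIZE[0], 3), dtype=np.float64)
    n_faces = 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

        # Update the running average with every face seen so far.
        for (x, y, fw, fh) in boxes:
            crop = cv2.resize(frame[y:y + fh, x:x + fw], FACE_SIZE)
            n_faces += 1
            avg_face += (crop.astype(np.float64) - avg_face) / n_faces

        # Replace each detected face with the current average face.
        if n_faces:
            avg_uint8 = avg_face.astype(np.uint8)
            for (x, y, fw, fh) in boxes:
                frame[y:y + fh, x:x + fw] = cv2.resize(avg_uint8, (fw, fh))

        writer.write(frame)

    cap.release()
    writer.release()


if __name__ == "__main__":
    deidentify("hallway.mp4", "hallway_deidentified.mp4")  # hypothetical files
```

A real deployment would also have to retain the original footage securely so that it could still be produced under a search warrant, which is the policy point of the design Sweeney describes.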

Walter: But it seems like one of the things you are saying is happening is that the more information that's out there, the more law enforcement will also gather it in and say, "Well, you know, better safe than sorry, so we may as well have it," but that compromises our own personal privacy.

Sweeney: Well, it's not to say that all of us Americans are running around doing illegal activities, doing something wrong, and therefore have something to hide. It's not that; it's the secondary use of the data. That is, I may not mind so much law enforcement watching me completely in public space, but I don't have a way to control it or limit it there, and it could then be used for something else later. For example, suppose I get a discount on my medical insurance, or my medical insurance premium gets set, based on how much exercise they think I do. You can imagine, and we could think in terms of, future automated technology that would take in these images and try to calculate how much activity you have and then, since you no longer control the data, report it to your insurance company. That's a hypothetical example, but it is an example of how data that you may feel comfortable letting go of at one point can come back around and really hurt you. I'll give you another example: loyalty cards, like at the supermarket.

Walter: Right, right.

Sweeney: So, you know, we all go to the local Giant Eagle here in Pittsburgh, it's our big chain, and we use our loyalty card and we get cheaper groceries. And you say, "Well, I don't care if they know what I purchase and so forth." But there have been some really interesting issues that have already come up from the use of loyalty card information. There was one situation where they tried to correlate the amount of junk food you purchase with your medical claims data and insurance, and there was another one correlating junk food with absenteeism at work, trying to find out whether or not there were some of these patterns.

Walter: You know there are think tanks, all kinds of organizations out there that deal with computer security and privacy and those sorts of issues. What makes your lab different?

Sweeney: Well, like you said, there are many people who do work in computer security. One of the differences between computer security and the privacy work that I do is that computer security says there is some information that you have in a vault, and I am going to make that vault electronic so that only the person who is authorized to get it can get it; they are going to authenticate themselves, showing that they are the person they say they are and that they have the rights to access the data, and so forth. The kinds of problems that we've been talking about involve data freely given away. It is not in a vault, I don't have control over it, but in some sense I have to give it in order to exist in society, or in order to get the discount at the store, and so the real question then is: "How do you solve the problem in that space?" You know, Ron Rivest, the "R" of RSA fame, had a great saying to me once. He said, "You know, the problem with trying to solve these problems with laws is that laws move as a function of years and technology moves as a function of months." And that's really true, and so I think the long-term view of our work has got to be that computer scientists and engineers who are developing new technologies have to learn to build privacy into the technologies they build. I mean, that's really our long-term promise, because it's so much cheaper and easier for society if the new technology rolls out with the privacy controls in it, and then society can decide which ones to turn off or on. But historically we've built technology only from the standpoint of some new interesting use, some new ability, and we rush to push it out the door. Scott McNealy, I would argue, is sort of in that camp, in the sense that Sun at that time primarily made database systems. People were concerned, from a privacy standpoint, about the amount of information going into databases, and he wanted access to databases to be somewhat ubiquitously available. So it's a little like taking your technology and running really fast and saying, "Well, forget privacy, it's dead; get out of the way, you know, there is a train coming."

Walter: We've got a job to do here.

Sweeney: But I think, you know, since then many people have recognized that we can do the same things, but we can put some privacy controls on them as well.
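Earlier in this answer Sweeney contrasts the computer security "vault" model, where data is released only to an authenticated, authorized requester, with data that is freely given away. As a toy illustration of that vault model only, not any real system, and with made-up credentials and record names, here is a short Python sketch.

```python
# Toy illustration of the "vault" model contrasted above with freely given data:
# a record is released only after the caller both authenticates (proves who they
# are) and is authorized (appears on the record's access-control list).
# All names, passwords and records here are invented for the example.

import hashlib
import hmac

_PASSWORD_HASHES = {"alice": hashlib.sha256(b"correct horse").hexdigest()}
_ACL = {"patient_record_17": {"alice"}}          # who may read which record
_VAULT = {"patient_record_17": "diagnosis: ..."}  # the protected data itself


def authenticate(user: str, password: str) -> bool:
    """Does the caller prove they are who they claim to be?"""
    expected = _PASSWORD_HASHES.get(user)
    supplied = hashlib.sha256(password.encode()).hexdigest()
    return expected is not None and hmac.compare_digest(expected, supplied)


def read_record(user: str, password: str, record_id: str) -> str:
    """Release the record only to an authenticated, authorized caller."""
    if not authenticate(user, password):
        raise PermissionError("authentication failed")
    if user not in _ACL.get(record_id, set()):
        raise PermissionError("not authorized for this record")
    return _VAULT[record_id]


if __name__ == "__main__":
    print(read_record("alice", "correct horse", "patient_record_17"))
```

The privacy problems discussed in the rest of the interview are precisely the ones this model cannot address: once the data has been handed over, there is no vault left to guard.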

Walter: Perhaps you can tell us where people can learn more about your work and more about the lab. You have a...

Sweeney: I’ll give you a quick URL.
Walter: Okay.
Sweeney: It's privacy.cs.cmu.edu

Walter: Okay. All right. And so anyone who wants to know more about what Latanya is up to can check that out. And once again, we've been talking with Latanya Sweeney; she is founder and director of the Data Privacy Lab at Carnegie Mellon University, and she is considered one of the nation's leading experts in creative ways to protect privacy.

Steve: The profile of Latanya Sweeney by Chip Walter is available free on our Web site. Just go to www.SciAm.com, hit the link for the magazine and go to the Insights section of the July issue. The article is entitled "A Little Privacy Please." We'll be right back.

Male announcer: Wandering around? Visit Scientific American mobile edition on your Web-enabled mobile device. Go to wap.SciAm.com for the latest science news and analysis, plus daily trivia questions. That’s wap.SciAm.com on your mobile's browser.

Steve: Now it's time to play TOTALL…….Y BOGUS. Here are four science stories, only three are true. See if you know which story is TOTALL…….Y BOGUS.

Story number 1: In a British University proof of concept project, researchers built a car out of old magazines and newspapers.

Story number 2: Two M.I.T. students have come up with a plan for harvesting the motion of people in crowds to generate electricity.

Story number 3: Four people were killed last week when Iraqis celebrated their country's first Asian Cup soccer championship by shooting guns in the air.

And Story number 4: A replica of an articulated false toe found on a mummy in the Cairo Museum will be tested to see if it's functional, which would make it the oldest functional prosthetic yet known.

Time is up.

Story number 4 is true. British researchers will test a replica of the mummy's toe on volunteers missing their right big toe to see if the device aids in walking. If it does, it would lend credence to the idea that the toe is the oldest known functional prosthetic. For more, check out the July 30th episode of the daily SciAm podcast 60-Second Science.

Story number 3 is true. Four Iraqis were killed [and] at least 17 were wounded by celebratory gun fire after the soccer victory. Until guns and rifles are developed from which bullets can achieve escape velocity and reach orbit and not be brought back to the earth's surface by gravity, sports fans around the world, you've got to stop shooting off guns and rifles when your team wins something.

And Story number 2 is true. M.I.T. students have developed a plan for what they call "crowd farming," to generate electricity from human motion. In a train station, for example, a responsive system made of blocks under the floor would be slightly pushed down by the milling masses. As the blocks slipped past each other, they would act like a dynamo, which converts energy of motion into electric current. The idea won the Japanese Holcim Foundation Sustainable Construction competition. The students were inspired, they say, by Thomas Edison, who had a turnstile in his house that visitors had to pass through—and when they did, they helped pump water into his holding tank.

All of which means that Story number 1 about a car built out of recyclable magazines and newspapers is TOTALL…….Y BOGUS. Because what is true is that researchers at Warwick University in England built a car mostly out of vegetable products. According to the Australian Herald Sun newspaper, the tires are made from potatoes, the body is made out of hemp, and it runs on fuel from fermented wheat and sugar beets. Despite its vegetative origins the car, which can reach speeds of about 150 miles per hour, is reportedly not a lemon.

Well, that's it for this edition of the weekly Scientific American podcast. You can write to us at podcast@SciAm.com, check out news articles at our Web site, www.SciAm.com, and the daily SciAm podcast 60-Second Science is at the Web site and on iTunes. For Science Talk, the weekly podcast of Scientific American, I am Steve Mirsky. Thanks for clicking on us.

Web sites mentioned in this episode include privacy.cs.cmu.edu; www.chipwalter.com
