
Privacy And Civil Liberties In The Face Of Surveillance


We’ll talk about privacy and liberty in the age of the Internet and big surveillance with big web thinker Jaron Lanier.

Denise Harwood diagnoses an overheated computer processor at Google’s data center in The Dalles, Ore. Google uses these data centers to store email, photos, video, calendar entries and other information shared by its users. (Connie Zhou, Google/AP)

The big NSA surveillance story clearly isn't over. It may never be over in this new age in which we live. Our lives, online, sliceable and diceable and traceable.  We are now data.

And metadata.  Huge industries are built on that. So is a whole lot of intelligence-gathering, it appears.

So what is privacy now? What is liberty? What personal boundaries can we claim? Can we protect? Who can we trust to respect any boundary around our personal data? Ourselves?

Big web thinker Jaron Lanier is on it.  He’s with us.

This hour On Point: Privacy and liberty in the age of digital surveillance.
-- Tom Ashbrook

Guests

Craig Timberg, national technology reporter for The Washington Post. (@craigtimberg)

Jaron Lanier, computer scientist, composer, visual artist and author. His latest book is "Who Owns The Future?" He also wrote "The Meta Question: What Is The NSA Doing With Your Metadata?"

Show Highlights

Lanier on the Facebook generation, lost privacy and the danger to democracy:

Facebook somehow convinced the whole world of young people in just a few years to have a different attitude about privacy. And the problem with that is you can abstract privacy and say it's just one little thing and it has nothing to do with your commercial prospects and your career and it has nothing to do with your political standing. But the problem is that it's not a fair kind of loss of privacy because you don't have the giant computer to scrutinize the other people who are scrutinizing you. So the people with the giant computers can analyze more of your data than you can of theirs. So a loss of privacy isn't this generally open and fair thing that it seems to be at first. And that's the problem with it; there really isn't equity.

If you give up your privacy in the way that this generation is doing, first of all, to my view — and my view is controversial, to be clear — it's also having a detrimental effect on their commercial prospects. They say, "I'll give away my data for free," but we're entering more and more of an information economy. And so what they're really saying is "I'll work for free." So we're entering into this world of reduced financial expectations that's actually the flip side of the reduction of privacy.

And that does have an effect on liberty. Once again, it's a slow, creeping kind of an effect. But the more wealth and influence is concentrated in a tiny minority, the more democracy is stressed. This is a problem in our politics already. And the way we're doing things with open information benefiting the people with the biggest computers is only furthering this idea of concentration of wealth and influence. So I do think that what the Facebook generation has been taught is to their detriment in the long term.

Lanier on the co-evolution of intelligence and commercial technology:

About two decades ago, computers got cheap enough that you could conceivably start to just gather the whole network into a big computer and process it. That's when we saw the birth of the big Internet companies. We saw the birth of all these business models where you gather personal information and try to target people with it.

First of all, the intelligence community was all over Silicon Valley in those years. There was a kind of a co-evolution of intelligence technology and commercial technology. But the thing is that the initial promise turned out not to work. What the initial promise was was that artificial intelligence would be able to automatically find the needle in the haystack — that was the metaphor of choice. Somehow you'd have these algorithms that could just find you the terrorist, find you the kingpin, that sort of thing. And in the abstract, there are a lot of demonstrations that it works, but in practice it's hard. One of the reasons is that it's really easy to create fake metadata, so criminals like the phishers and spammers of the world, the hucksters online use fake data all the time. [TOM ASHBROOK: They would disguise their profile, their whereabouts, their activity.] Yeah, so it becomes a cat-and-mouse game. At the end of the day, this idea of finding the needle in the haystack automatically kind of faded away, and you really don't hear that metaphor anymore.

Instead, what happened in Silicon Valley is a transformation to a really different way of using these types of computers, where what we do is we gather tons and tons of data and then we use statistics to get an edge — a small edge, but a persistent one over time — in being able to put ads in front of people that are a little bit more effective than they might otherwise be. But it's a very imprecise, broad brush kind of a tool.

Lanier on using data after an event vs. using data to predict events:

So we have to distinguish two really different uses of this kind of data. One is when an investigator has some starting point. For instance, we know there was a Boston bombing. Can we go back in time and figure out who these people communicated with? Who were the bombers in touch with? So if there's a directed investigation with a person doing the interpreting, metadata is key — that's how analysis works. However, the idea of finding the bombers in advance, the idea that there's some magic button that can sort of analyze the world and understand people, has not proved to be fruitful. It just hasn't worked, and I suspect it won't work.

So if it's happening at all, what is its purpose? What can they get out of it? What I'm concerned about is that just as an investigator might be able to use metadata in a directed way in an investigation, so could a bad actor. So what I'm asking is if there's a Snowden, why can't there be a blackmailer? If there's a Snowden, why can't there be a government program that targets certain parts of the population who are in political disfavor, making it a little harder to get loans or something like that?

Lanier on "computational equity" for individuals:

So one idea is a sense of equity that individual power, individual self-determination can't be overwhelmed by a remote entity like the government. I think that's a fairly concrete statement. And right now because of computation — if everybody's sharing information freely, whoever has the giant computer does have such an edge over an ordinary person who doesn't have the giant computer that there is that sense of being overwhelmed. And so restoring some kind of an equity there ought to be possible by changing the architecture of how we do things. Making information cost money seems to me to be the simplest approach, but there are other ideas that could work too. So I think a concept of computational equity is at least concrete, and I think we can gradually start to talk about that.

Lanier said that the people he's met from the intelligence community were "good people." As for Silicon Valley, he said we have "a remarkably pleasant and good-natured elite." But his praise didn't come without this word of caution:

The problem is if we set up systems that are inherited by people who aren't so great, it really doesn't matter that the earlier people were nice. We have to assume the worst from people when we design our systems. Even if you think Snowden is a villain, you have to appreciate that he brings to light that our current systems don't foresee villains adequately. And I'm not sure if he is; I'm still on the fence about him right now. But we're definitely not being paranoid enough about ourselves, which is the most important thing. We're much more paranoid about shadowy dangers that are less clear, but our own danger to ourselves is the most clear one that we have to be the most attentive to.

Lanier on the materialist ethic and the need to treat people as "mystical, special creatures":

It's a funny thing: A lot of people in technology have such a mechanical worldview at this point. They think of people as computers, and of the giant net as making a giant computer out of the people. There's this sort of materialist ethic that's become so powerful that in a way they don't really believe in free will anymore. There's a whole kind of philosophy in which the idea of liberty doesn't quite make sense in the way it might to other people. That's another level at which we have a conversation. Technology people have to treat humans as being sort of mystical, special creatures — we're not doing enough of that.

From Tom's Reading List

The Nation: The Meta Question: What Is The NSA Doing With Your Metadata? -- "Ever since we learned about PRISM, the NSA’s secret project to collect metadata on everyone by tapping into commercial online services, we've been confounded by a tangle of intangible clashing values. We are asked to balance 'preventing terrorism' against 'protecting privacy.' It is hard to demonstrate what terrorism would have occurred without preventative measures, and privacy is as much a feeling as a circumstance."

The Washington Post: Tech Companies Urge U.S. To Ease Secrecy Rules On National Security Probes -- "Calls for greater transparency, rather than new limits on government powers, have been the main public fallout in the days since The Washington Post and Britain’s Guardian newspaper reported that the NSA was collecting and analyzing data flowing through nine U.S. Internet companies. The program, called PRISM, reportedly was focused on foreigners but also collected data on U.S. citizens and residents that could, under certain conditions, be reviewed by officials."

The Atlantic: How Big Is The NSA Police State, Really? -- "But to answer that question, we needed to answer three other questions. What information is being collected in the surveillance operations? How much of that information is the NSA housing? And, how much space would saving that much information actually take up? What we learned from talking to a variety of experts is that the calculus is not simple, and any answers are largely estimates. But we got answers."

Foreign Affairs: The Rise Of Big Data -- "The Internet has reshaped how humanity communicates. Big data is different: it marks a transformation in how society processes information. In time, big data might change our way of thinking about the world. As we tap ever more data to understand events and make decisions, we are likely to discover that many aspects of life are probabilistic, rather than certain."

Earlier Coverage

We spoke about surveillance, national security and the U.S. Constitution with Glenn Greenwald, reporter for The Guardian, who broke the story about NSA whistleblower Edward Snowden.


This program aired on June 13, 2013.
