Imagine a not-so-distant future where you’re just driving on the highway. Your car is sending real-time data about your performance behind the wheel to your insurance company. And in return, the insurance company is sending real-time driver behavior modification punishments back — real-time rate hikes, curfews, even engine lockdowns. Or, if you behave in the way they like, you get an instant rate discount.
In other words, the insurance company is shaping your behavior right then and there. Would you like that? What does it mean for our entire understanding of free will?
That future may not be far off. It started with Google sucking up your data to serve you targeted ads, then Facebook selling access to that data to groups who want to influence your votes. And Harvard big thinker Shoshana Zuboff, author of "The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power," says where it goes next could threaten democratic governments and personal autonomy.
On what we mean when we talk about "surveillance capitalism"
"Surveillance capitalism is an unprecedented approach to making money, for lack of a better word, capital accumulation, and here's how it works: It unilaterally claims private human experience as its own commodity that can be translated into behavioral data, which can then be bought and sold in a new kind of marketplace that trades exclusively in predictions of our future behavior, what we will do now, soon and later. What we immediately see in this remarkable new equation is that we've shifted the whole premise of exchange, and of how capitalism works, from a relationship with customers to a relationship with people that is simply one of raw materials. We are no longer the customers of capitalism; we are its free source of raw materials, from which predictions of our behavior are fashioned for trade in these new and, in a certain respect, ominous new futures markets that I call 'behavioral futures markets.' "
On an example of this kind of market
"The thing about these examples is that they literally are all around us. I write in the book about surveillance capitalism being discovered, invented, elaborated at Google, migrating to Facebook, becoming the default option for Silicon Valley apps and startups and so forth, but now it's everywhere. It's across every economic sector.
"Surveillance capitalism was being invented between 2000 and 2002, and it was being applied at Google. Here's something interesting: In 2002, a comprehensive review of telemedicine was written, before anyone knew about surveillance capitalism, because Google kept it quite a secret. So here's this comprehensive review, with data scientists and engineers involved, and the review includes a diagram for a proposed digital architecture of telemedicine. And that diagram is a simple closed loop. It includes three nodes — one is the patient in her home, one is her physician and the other is the hospital where the server is located. The whole idea here is that a person's health data ... that is the essential elemental property of the person. My body, my information.
"That's 2002, before we had heard of surveillance capitalism. Now let's fast forward, and I want to take us to the year 2016, where there are now more than 100,000 'mobile health apps.' So let's drill down into just one version of this. This comes from nothing less than the Journal of the American Medical Association. Here's a comprehensive study of Android-based diabetes apps. We still want telemedicine, we still want those advantages for us, but here's what they found about diabetes apps. They examined 211 apps, and then selected some of them, about 70, for deep-dive analysis. What they found is that just by downloading the software for the app, it automatically authorized the collection and even the modification of sensitive personal information on your phone. And then they figured out that 64 percent of those apps secretly modify or delete your information. Thirty-one percent secretly read your phone status and identity. Twenty-seven percent secretly gather location data. Another 12 percent view your WiFi connection, and then there's actually 11 percent that go ahead and activate your camera so that it can access your photos and your videos. Finally, they found that between 4 and 6 percent went even further — they read your contact list, they called phone numbers on your phone, they modified your contacts, they read your call logs, a few of them even activated your microphone to record your speech."
On the reasons appmakers give for needing this information
"They have very good reasons. Those reasons are the economic imperatives of surveillance capitalism. And the operations that I'm describing to you are exemplary. These are excellent, highly skilled operations in surveillance capitalism. As far as helping us, that's really not in the equation when it comes to these backstage operations that are intentionally designed to evade and bypass our awareness. In a way, we're living in this bubble that is constructed to keep us ignorant of the very things I write about in this book, and that the Journal of the American Medical Association was so kind as to bring to our attention with this brilliant research."
On the goals of this economic system
"Surveillance capitalism, like other forms of capitalism, is a competitive operation and there are competitive dynamics, and the key competition is around who's going to get the most predictive behavioral data? What is the most predictive behavioral data? So this started out as, 'We need scale, we need volumes. We need to just extract as much as possible.' As the competition heated up, then it became clear that we also need scope. We need a lot of data but we also need varied data, and different qualities, different levels of the person and different aspects of their actions.
"We approached the internet and the wonders of the digital world seeking empowerment, seeking the democratization of knowledge, seeking a whole range of freedoms from the frustrations imposed upon us by real-world institutions. And there is no question in my mind that digital technologies offer us these tremendous gifts, these great, great riches. The argument I make is that 21st century citizens should not pay for those privileges with the situation that we're in now, which is having been relegated to the status of free raw material for a new regime of capitalism that is oriented toward business customers."
On regulatory framework for this technology
"Privacy is critical, and we have privacy law. We need more. The anti-trust issues are critical. We have anti-trust laws and they need to be enforced. But neither of those specifically addresses this unprecedented logic of surveillance capitalism. So my argument has been that you can't design a vaccine unless you have a deep and accurate understanding of the enemy disease. And that's the history of every successful vaccine: once you actually understand the disease, then the vaccine gets a lot easier. So, in this situation, we need to understand how surveillance capitalism works, which is why I've spent a chunk of my life trying to figure it out and share it with people.
"For example, let's say we just broke up Facebook, but didn't drill down into the level of some of these foundational operations that I've been describing. If we break up Facebook, that means maybe we have four smaller surveillance capitalists instead of one big surveillance capitalist. And then we also disrupt the competitive landscape, opening up more opportunities for more surveillance capitalists to come in and compete for the behavioral data and the revenues from the futures markets. Obviously, that's not enough. We need to understand exactly how it works, how it unilaterally claims our experience, how it takes our experience, how it translates it into data, how it claims the right to translate our data into predictions, how it claims the right to sell those predictions to other people and not to us, how it claims the right to do all of this without our knowledge and how it claims the right to maintain the conditions that allow it to have all of these other rights."
Alex Schroeder adapted this interview for the web.