
Could Facebook Or Google Manipulate An Election?

In this Oct. 17, 2012, file photo, a man raises his hand at Google offices in New York. People should have some say over the results that pop up when they conduct a search of their own name online, Europe's highest court said Tuesday, May 13, 2014. (AP)

Imagine for a moment that, as an undecided voter here in the Bay State, you start seeing ads deliberately filtered into your online life by a human hand — by, say, a politically interested person at Facebook or Google — supporting McCormick or some other candidate.

We see this all the time on the Internet: ads tailored exactly to our wants and needs. But could these ads be used to swing an election? Some researchers say they can. And a study out last month suggests it's easy to do; in India, researchers were able to swing undecided voters over to a particular candidate by more than 12 percent.

WBUR’s Sacha Pfeiffer speaks with a Harvard professor about "digital gerrymandering" and how to prevent it.

Guest

Jonathan Zittrain, co-founder and director at Harvard Law School's Berkman Center for Internet & Society. He tweets at @zittrain.

Transcript

Sacha Pfeiffer: Jonathan, you wrote about this in the most recent edition of the New Republic and you actually coined the term "digital gerrymandering." Could you explain to our listeners what that means, exactly, in real-world terms?

Jonathan Zittrain: Yes. In real-world terms, I was curious about an experiment that some researchers had done in which Facebook had simply put up what you might call a house ad — it's Facebook's own message to its users in their news feed — on Election Day in 2010 saying, "It's Election Day. Click here to find out where your polling station is. Here are seven of your friends on Facebook who've already voted. Did you vote?" That message was displayed to tens of millions of people and not displayed to some, for the purpose of figuring out how many people it might importune to vote who wouldn't otherwise have voted. And the researchers went to the trouble of looking at the voting rolls to see who had voted and who hadn't, and they found a small but not insignificant number of people who were impelled to vote by that small reminder in the news feed.
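To make the mechanics of that experiment concrete, here is a minimal, hypothetical sketch (not the researchers' actual code or data) of how such a randomized banner test could be analyzed: users are split at random into a group shown the reminder and a group that is not, then matched against public voter rolls to compare turnout rates. Every name and number below is made up for illustration.

```python
# Hypothetical sketch of a voter-turnout A/B test like the one described above.
# Assumptions: users can be randomly assigned to see or not see the banner,
# and turnout can later be checked against public voter rolls.
import random

def assign_treatment(user_ids, treated_fraction=0.98):
    """Randomly assign each user to the banner ('treatment') or no-banner ('control') group."""
    return {uid: ("treatment" if random.random() < treated_fraction else "control")
            for uid in user_ids}

def turnout_lift(assignments, voted):
    """Compare turnout between groups; `voted` maps user_id -> True/False from voter-roll matching."""
    counts = {"treatment": [0, 0], "control": [0, 0]}  # [voters, total] per group
    for uid, group in assignments.items():
        counts[group][1] += 1
        counts[group][0] += int(voted.get(uid, False))
    rates = {g: (v / n if n else 0.0) for g, (v, n) in counts.items()}
    return rates["treatment"] - rates["control"], rates

# Made-up example data: treated users vote at a slightly higher rate.
users = list(range(10_000))
groups = assign_treatment(users)
voter_roll = {uid: random.random() < (0.61 if groups[uid] == "treatment" else 0.60)
              for uid in users}
lift, rates = turnout_lift(groups, voter_roll)
print(f"treatment turnout {rates['treatment']:.3f}, control {rates['control']:.3f}, lift {lift:.3%}")
```

Even a lift of a fraction of a percentage point, applied across tens of millions of users, translates into the "small but not insignificant" number of extra voters Zittrain describes.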

Kind of a get-out-the-vote effort? Maybe a benign get-out-the-vote effort, you might say?

Yes, absolutely, and it's modeled after the kind of thing where Facebook for a while went on a campaign to encourage people to become organ donors.

If you consider that benevolent, where do you see that it could become insidious?

Well, the gerrymandering part then can come in. And this is now hypothetical, I should be clear. But the question would be: suppose Mark Zuckerberg woke up one morning and thought it was really important to the future of the country that one candidate rather than another win a given election. Couldn't he make a prediction quite easily about who would be voting for the candidate he wants and only show the "here are the poll locations, it's voting day" message in the feeds of those he figures would vote his way?

So based on your political leanings, he might determine who he thinks your likely candidate is and keep promoting that candidate to you?

Correct. It's quite clear it's an ad. It's a house ad. It's something that Facebook is telling you. It's not sneaky in that sense. What's sneaky is who isn't getting it and who is harmed by that selective choice. But I think from the research we see, that could very well in certain circumstances cause an election to go one way rather than another.

And this isn't entirely hypothetical because we referenced that research in India where there was an attempt to sway voters and it seems to have worked.

Yes, and that, too, was a classic psychology experiment. It used a mock web site for a fake search engine, run by a psychologist named Robert Epstein, and he found that if he changed the search results people saw when they were asked to look up one candidate's name or another — sure enough, it affected their view of the candidates. If Google, for example, were to make sure that all the positive stories about a given candidate were pushed down and the negative ones came up, that candidate would end up losing some votes.
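As a purely illustrative sketch of the kind of re-ranking being described (the result data, scores, and weighting below are hypothetical, not Epstein's methodology), a ranker only needs to apply a small penalty to favorable coverage for positive stories to sink down the page:

```python
# Hypothetical sketch: a "neutral" ranker sorts results by relevance alone,
# while a biased ranker quietly demotes stories favorable to a candidate.
from dataclasses import dataclass

@dataclass
class Result:
    title: str
    relevance: float   # what a neutral ranker would sort by
    sentiment: float   # -1.0 (negative about the candidate) to +1.0 (positive)

def neutral_rank(results):
    return sorted(results, key=lambda r: r.relevance, reverse=True)

def biased_rank(results, penalty=0.5):
    # Same sort, but favorable coverage is penalized so it sinks in the list.
    return sorted(results,
                  key=lambda r: r.relevance - penalty * max(r.sentiment, 0.0),
                  reverse=True)

results = [
    Result("Candidate unveils popular plan", 0.92, +0.8),
    Result("Candidate faces ethics inquiry", 0.90, -0.7),
    Result("Profile: the candidate's record", 0.85, +0.3),
]
print([r.title for r in neutral_rank(results)])  # positive story ranks first
print([r.title for r in biased_rank(results)])   # negative story ranks first
```

The point of the sketch is how small the intervention is: nothing is removed or fabricated, the ordering simply tilts, which is exactly why a user would have trouble noticing it.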

I was going to ask you if it could go a step further in a nefarious way. So rather than just give ads to you for the candidate that they think you favor, they could actually try to disfavor someone and have you lean another way.

And it wouldn't even be ads; this would be on your actual raw search engine results. And the one thing we have to give us some solace that our lives aren't being manipulated is, 'Ah, it's only a machine. There's some kind of search algorithm. It's just a big complicated black box.' Well, what if somebody wanted to put a thumb on that complicated scale? And should that happen I think it would be quite worrisome.

Jonathan, you have a proposal for how to prevent this, or at least reduce the risk of so-called digital gerrymandering. And that's to have big information companies like Google or Facebook be considered "information fiduciaries." Now, that's a bit of a technical/legal term, but is it basically like attorney-client privilege, where your lawyer can't share personal information about you?

Yeah, that's not a bad example. The question is: could we arrange it so that there's some duty, however light, for Google, Facebook, Bing, you name it, to kind of deal you a straight hand? To mark the ads as ads, but for your results — your search results — they should be working for you. And could we come up with a world in which not every search engine or every information provider has to be held to some standard, but some of them, especially the big ones, might be persuaded to volunteer to do it?

Is your idea that companies like Facebook and Google could not use the personal information they know about you from your web searches and your online postings to customize what they put into your feed or on your computer screen? Or at least, if they do, they'd have to be more transparent about it?

They could absolutely still use it to customize, the way that you would expect a full-fledged investment advisor who owes you that duty to customize recommendations. It's like, "Oh, you're about to retire? You certainly shouldn't invest in a jackalope ranch." That's very personalized! But it says: have that person's interests at heart. When you ask Siri for advice on your iPhone, trying to anticipate your needs and make suggestions, there's that much more reliance being placed on the intermediary to shape your world and your experience. And that's why, to me, this is a very good time to be thinking about doing something about it. In between too-early-to-tell and too-late-to-do-anything-about-it, I'd rather be in the first zone.

More

The New Republic: Facebook Could Decide An Election Without Anyone Ever Finding Out

  • "All sorts of factors contribute to what Facebook or Twitter present in a feed, or what Google or Bing show us in search results. Our expectation is that those intermediaries will provide open conduits to others’ content and that the variables in their processes just help yield the information we find most relevant. Digital gerrymandering occurs when a site instead distributes information in a manner that serves its own ideological agenda."

The Washington Post: Research In India Suggests Google Search Results Can Influence An Election

  • "With a group of more than 1,800 study participants – all undecided voters in India — the research team was able to shift votes by an average of 12.5 percent to favored candidates by deliberating altering their rankings in search results. There were also increases in the likelihood of voting and in measurements of trust for the preferred candidates, and there were decreases in the willingness to support rivals."

This segment aired on June 6, 2014.


Sacha Pfeiffer Host, All Things Considered
Sacha Pfeiffer was formerly the host of WBUR's All Things Considered.

