The Rise And Threat Of Fake News

Sheryl Sandberg, chief operating officer of Facebook, right, smiles after a news interview with Megyn Kelly, left, on the show, The Kelly File, on the FOX News Channel, Wednesday, April 9, 2014, in New York. (AP Photo/Frank Franklin II)

"Megyn Kelly has been fired from Fox News." "September 11th was an inside job." "The iPhone 8 will have Siri physically come out of your phone and respond to user commands."

All of these are fake news stories, but all of them have been promoted on Facebook's Trending Topics feed over the past several months. With more than 40 percent of American adults getting their news from Facebook, and 1.7 billion people using the site, the question of whether it, and other social media platforms like Google and Twitter, are promoting factual news is more pressing than ever.

Filippo Menczer On How Big The Fake News Problem Is

"Part of the problem is we don't know how big it is. We know that there is a huge amount of misinformation flowing around in social media, but we are just beginning research efforts to try and quantify the effect. There is, in fact, a study that came out just today from some colleagues at the University of Southern California who used a tool that we developed to detect social bots on Twitter. By doing a large-scale analysis of tweets about the election, they found that a very large percentage, about 20 percent of tweets about the election, seem to be generated by bots. So that's quite surprising. And we don't really know the effect of that avalanche of misinformation."
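The analysis Menczer describes boils down to scoring each tweet's author with a bot detector and computing the flagged fraction. A minimal sketch of that estimate, assuming a per-account bot score in [0, 1] is already available (the field names and threshold here are illustrative, not the study's actual method):

```python
# Hypothetical sketch: estimate the share of bot-generated tweets,
# assuming each tweet carries its author's bot score in [0, 1]
# (e.g. from a detector like the one described). All names are illustrative.

def bot_share(tweets, threshold=0.5):
    """Return the fraction of tweets whose author's bot score exceeds threshold."""
    if not tweets:
        return 0.0
    flagged = sum(1 for t in tweets if t["bot_score"] > threshold)
    return flagged / len(tweets)

sample = [
    {"text": "Vote now!", "bot_score": 0.9},
    {"text": "Great debate tonight", "bot_score": 0.1},
    {"text": "BREAKING: ...", "bot_score": 0.8},
    {"text": "My thoughts on the election", "bot_score": 0.2},
    {"text": "RT this!", "bot_score": 0.7},
]
print(bot_share(sample))  # 0.6 on this toy data: 3 of 5 tweets flagged
```

The USC study's roughly 20 percent figure would correspond to `bot_share` returning about 0.2 over the full election tweet corpus; the choice of threshold is where the real methodological care goes.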

On Facebook's Human Editorial Oversight

"There used to be. In fact, those trending topics were vetted by editors, and then there was some controversy and some accusations that Facebook editors were possibly biased. As far as I know, those allegations were not really confirmed, so we don't know if that was also misinformation; it was based on one account. But the backlash on Facebook was huge, and as a result they decided to make that algorithm completely automated."

On Facebook's Recommended Stories

"My understanding is that Facebook is trying really hard to show a breadth of different sources of information. Sometimes, the mechanism is also very useful because if you're looking at fake news, sometimes what you see underneath is a link to a fact-checking site that tells you that, in fact, that piece of news maybe is not true. It could help and it could also hinder, but in general, the things that you see on your feed are highly affected by the mechanisms by which social media decides what you may wanna see and what you are most likely to engage with. The things you might be excited about are not necessarily the things that are most accurate."

On If An Algorithm Can Tell Between Real And Fake News

"Not at this stage. It would be nice to get to that, maybe in several years. This is still in the area of artificial intelligence that we are very far from. Our group is doing a lot of research on trying to answer different bits of this question, trying to see whether you can fact check some very, very simple statements by computational methods but we are really at the very beginning. Right now, only a human, and sometimes even for humans it is difficult to discern whether a piece of news is true or not... The problem is that you can't keep up. There is so much misinformation out there."

On Keeping Up With The Volume Of Misinformation

"It is really impossible. The nature of social media, the business model, is for them to get people engaged so that they will click on ads. There is now a cottage industry of websites that do nothing other than fabricating fake news... it's very cheap to fabricate fake stories.... If we see something that seems to match what we believe or what we feel, we're more likely to believe it and to click on it... They spread very, very fast inside our filter bubbles or echo chambers, among those who are more likely to believe them. And it happens on all sides of the political spectrum. So what to do about it is a very tough question."

On The Potential Conflict Of Interest Between Ethics Of Fake News & Profit

"It's not the case that people at these companies don't care. That being said, clearly, it is tough. And their goal is engagement. And even our goal — we wouldn't use a platform if it's not fun and if it doesn't show us things that we're interested in. Sometimes that's innocuous, maybe cat pictures from our sister. But other times, it could be dangerous if it's misinformation. And the problem is that there is a misalignment between the incentives toward getting people engaged and the incentives to provide accurate information.

"Traditional media, which are beleaguered all the time, had a very strong editorial role. Their whole role, their business model, was to provide accurate and fact-checked information. And now, yes, it's nice that anybody can be a producer of information, but the flip side of that is that there are these incentives to produce all sorts of junk."

On What Social Media Users Can Do To Protect Themselves

"I think the first step is for people to be aware that they can be the victim of manipulation. With new technology, we often don't know at the beginning what possible dangers may come from it. If people are aware they can be cheated, just like you know you shouldn't accept candy from a stranger — we teach that to our children — but we don't know how to teach about checking sources, checking accuracy... Generally, you just have to be skeptical."


Filippo Menczer, professor of computer science and the director of the Center for Complex Networks and Systems Research at the Indiana University School of Informatics and Computing.

This article was originally published on November 07, 2016.

This segment aired on November 7, 2016.


