
How Russia is trying to influence the 2024 election

U.S. and Russian national flags wave on the wind in Moscow's Vnukovo airport, Russia, April 11, 2017 to welcome a U.S. dignitary. (AP Photo/Ivan Sekretarev, file)

Russia’s efforts to influence American voters in this year’s elections are getting more advanced.

They’ve even used American media personalities to try to sow division. How can we protect election integrity?

Today, On Point: How Russia is trying to influence the 2024 election.

Guests

Will Sommer, media reporter at the Washington Post. He specializes in covering conservative media and conspiracy theories.

Gavin Wilde, senior fellow in the Technology and International Affairs Program at the Carnegie Endowment for International Peace. Served on the National Security Council as director for Russia, Baltic, and Caucasus affairs from 2018 to 2019.

Nina Jankowicz, co-founder and CEO of The American Sunlight Project. Former head of the Department of Homeland Security’s Disinformation Governance Board. Author of “How to Be a Woman Online: Surviving Abuse and Harassment, and How to Fight Back."

Transcript

Part I

MEGHNA CHAKRABARTI: On November 1st, 2023, right wing YouTuber Matt Christiansen told his roughly 240,000 subscribers that he had an exciting new opportunity.

MATT CHRISTIANSEN: Lauren and Liam approached me over the summer and said, Hey, we're building this new project. It's going to be a central hub for some conservative or libertarian or just outside-the-box content. Are you interested in contributing to that?

CHAKRABARTI: Lauren and Liam are Lauren Chen and Liam Donovan, conservative Canadian influencers. And their new project was Tenet Media. A YouTube channel for a kind of super group of six right wing influencers.

Matt Christiansen, Dave Rubin, Tim Pool, Benny Johnson, Lauren Southern, and Tayler Hansen. Tenet Media's tagline was, Fearless Voices Live Here. Its promotional video featured dramatic shots of the commentators bathed in purple light.

(MONTAGE)

I think I've [expletive] off basically everybody. That's what happens when you're a free thinker.

Free speech is under such attack that we self-censor. Maybe they take us down individually. It's a lot harder to do when we're grouped together in this way.

When the media says dissident or antiestablishment, when I hear it, I think based, very cool.

Independent media, actual media's most important job is holding these people accountable.

The challenge for all of us is to not be passive participants in letting the country slip away.

CHAKRABARTI: But less than a year after it launched, Tenet Media came crashing down. This month, the U.S. Department of Justice alleged Tenet was illegally operated behind the scenes by employees of the Russian funded RT, formerly known as Russia Today. The DOJ indicted two RT employees saying Tenet was a foreign influence operation, funded with some $10 million in Russian money.

But the right wing influencers that appeared on Tenet say they had no idea Russia was allegedly behind the operation.

TIM POOL: When I first learned of this story, literally I'm skateboarding. And I get a DM from a journalist asking if I would talk about the Russian allegations or whatever and I was like, what? I was like, what allegations? I'm like, dude, I have literally no idea what you're talking about.

CHAKRABARTI: That's Tim Pool on his show, Timcast IRL, the day the indictment came down. Pool has around 1.4 million YouTube subscribers. And according to the indictment, Tenet paid him $100,000 per episode of a weekly show he hosted.

But Pool claims he and other commentators were deceived. And he said Lauren Chen, who allegedly funneled the Russian money to Pool and the other influencers, had no input on what he talked about on his show. Here's Pool talking with conservative commentator Ben Shapiro.

BEN SHAPIRO: Was Lauren ever talking to you about the editorial content of the show or anything like that?

TIM POOL: I barely even talked to Lauren. And it's crazy, I've known her for a long time, so when she said that she was launching a company, she had investors. I'm like, okay. I hear that 50 times a year. The first episode that actually appeared on Tenet was about skateboarding. It's crazy to hear that they're saying, the media's jumped the gun completely on what the story is.

CHAKRABARTI: Dave Rubin, who has some 2.5 million YouTube subscribers, allegedly received $400,000 a month for 16 videos on Tenet Media. He also claims he didn't know the $400,000 a month was Russian money. Here's Rubin on The Megyn Kelly Show.

MEGYN KELLY: That leads me to the tough questions that people want answered. Number one, now that it was the Russian money. Do you give it back?

DAVE RUBIN: I don't know who I would be giving it back to. The Russians that duped me? I'm not sure. I'll decide what to do with it. I don't even know what the honest answer is. I did the work. I did a show and, ironically, I'm quite proud of the show.

We did a silly show about viral videos where it wasn't even political largely, we did. It was like, fat girl goes into Wendy's and throws hamburger at somebody. It was nonsense. The gimmick of the show actually.

KELLY: That's not nice, Dave.

RUBIN: People like that on the internet.

KELLY: Did anybody ever tell you what to report, what to say or what not to report and say?

RUBIN: No, all of my conversations with Lauren, all she kept saying was ... we want heterodox thinkers.

We want you to do whatever you want to do.

CHAKRABARTI: YouTube has removed Tenet's channel from its platform, along with channels run by Lauren Chen. Neither Chen nor her husband, Liam Donovan, has publicly commented on the allegations. Now, whether the commentators were aware of the Russian involvement or not, what might the Tenet Media example show us about Russia's attempts to influence American voters' perceptions and beliefs in this year's elections?

And as Russia's tactics get more sophisticated, are the government and social media platforms using the same tools they were back in 2020 to fight disinformation? So that's what we're going to talk about today. And we're going to start with Will Sommer. He's a media reporter at the Washington Post where he specializes in covering conservative media and conspiracy theories.

Will, welcome to On Point.

WILL SOMMER: Thanks for having me.

CHAKRABARTI: Okay. So first of all, for people who aren't familiar with the conservative influencers who were swept up in this whole Tenet controversy, can you tell us a little bit more about them?

SOMMER: Sure. We've got a handful of people who are pretty significant deals, though not necessarily known outside of conservative media.

And so maybe that's why some of your listeners haven't heard of them. But people like Dave Rubin, Tim Pool, the guy named Benny Johnson, these are people who on their own have millions of subscribers each on YouTube, who are pulling in even aside from Tenet Media, millions of dollars easily a year.

Obviously, I can't look directly into their finances, but they are very influential. And so these are people who are big stars online.

CHAKRABARTI: Okay, and so then what does the DOJ allege that they were pulled into?

SOMMER: Yeah, so the indictment alleges that essentially Lauren Chen and her husband ... Lauren Chen is a right-wing YouTuber herself who is working for Glenn Beck's company.

And that basically she created this operation, Tenet Media, at the direction of Russian agents, employees of RT, with the idea of spreading a sort of pro Russia message in the United States. And then she went on to recruit people who I'm assuming were her friends. They swam in the same circles. People like Dave Rubin and Tim Pool and that they duped them so that they would create fake personas to say, this money isn't coming from Russia, it's coming from this Belgian businessman we've invented. And so that they then, everyone pooled their videos into this one YouTube channel.

CHAKRABARTI: Okay. We'll come back to the fact that these YouTubers, these influencers, say that they had no idea where the money was coming from.

But let's talk about the content specifically. If the allegation is that this was a Russian attempt to influence thinking around the election. To the point of, let's say, Rubin, who said he was just putting out like insulting and fun, funny, insulting videos that go viral, that didn't necessarily have any political content in them at all.

How does the DOJ say that's supposed to advance Russia's interests?

SOMMER: It's interesting. There really wasn't a ton of pro Russia content on Tenet Media. I was watching just as someone who follows this world. And if you had said to me, do you think that's Russianized? I don't think so.

I think the past Russian influence operations we've seen have been much clumsier and have come across as, they give themselves away when they start talking about, let's say, how Putin's doing such a great job invading Ukraine or the Wagner group, stuff like that.

But this one was, I would say, much more subtle. And then, so it makes you wonder what exactly did Russia get out of it? One thing, perhaps, is just sending enormous amounts of money to influencers they see as advancing their interests in the United States.

CHAKRABARTI: Meaning that's a loss leader and hoping that they would do content later on that would perhaps more serendipitously align with Russian goals?

SOMMER: Yeah, you look at someone like Tim Pool, for example, who has a video and this all sort of was reviewed in light of Tenet Media, but who had a video on his own platform saying, Ukraine is the enemy of the United States, Ukraine wants World War III, Ukraine's going to destroy the world. If you're sitting in the Kremlin or in the offices of RT, you might say, this is a guy who even unwittingly, would be a great guy for us to advance.

There's very little evidence in the indictment of a Russian saying, we want you to say this specific thing and we'll give you a sack of money. And yet, a lot of these people involved do have, I would say they have beliefs that I'm sure they came to honestly. But that coincide with what Russia would like to promote in the United States.

CHAKRABARTI: It is possible that Russia or the RT employees didn't even really care what these influencers would say. Because we've seen in election cycles past, that the Russian influence operations weren't necessarily to get Americans to think a certain thing, but rather just to sow discontent, division, chaos, flood the zone with bleep, to use Steve Bannon's phrasing from 2016.

What do you think about that?

SOMMER: I think there's a lot to that. You look at Tenet Media. Even the aesthetics of Tenet Media are jarring. It had all these like purple lights and stuff like that. And yeah, I think the idea of, certainly again, returning to Tim Pool, this is a guy who talks constantly about how the United States is on the verge of a Civil War. And anything will happen in the news, and they'll say, Oh my gosh, the Civil War is almost here. In this really overwrought way. And so if you're Vladimir Putin, you think, Gosh, it would be great to have this guy just constantly talking about how America is on the verge of collapse.

CHAKRABARTI: Now, let's get back to the fact that all these influencers say that they had no idea where the money was coming from. You said a little bit earlier that these were already very successful folks on social media. What was the money that they were making prior to Tenet?

Did I hear you properly saying that some of them were making money in the six figures or seven figures already?

SOMMER: Yeah, I think just in the case of Tim Pool and Dave Rubin I think it's easy to say that they're clearing probably a couple million dollars a year in terms of their companies. And so it's not as though these were people who were making $60,000 a year and then somehow Russia came and gave them $5 million a year.

That said, it's not like they were making $100 million a year and this is a drop in the bucket. This, I think, easily could have doubled their incomes for the year and possibly more. And so it gets to this point where I think Dave Rubin or Tim Pool has claimed that this is market rate for their videos.

I think that's a stretch. I think even from their perspective, I think this must have been a very notable amount of money. And to hear Tim Pool, for example, say that he never talked to Lauren Chen after she hired him, and yet she was paying him roughly $5 million a year, is just wild.

Part II

CHAKRABARTI: Gavin Wilde joins us now. He's senior fellow in the technology and international affairs program at the Carnegie Endowment for International Peace.

He also served on the National Security Council as Director for Russia, Baltic, and Caucasus Affairs from 2018 to 2019. Gavin, welcome to On Point.

GAVIN WILDE: Hi, Meghna. Thanks for having me.

CHAKRABARTI: Okay, so let me actually, regarding this Tenet Media story, let me inject some skepticism in here. Because, first and foremost, I want to acknowledge that some people out there, in legal circles, will want to point out that this is an indictment that's been handed down. And there's some skepticism about how much water indictments hold versus actual court cases.

But secondly, I come back to the point of, let's presume that what's contained in the indictment is true for a moment, that Russia, Russian operatives paid all of this money to advance what seems like nonsensical content on social media platforms in this country. What's the point?

WILDE: I think I share your skepticism Meghna, because I think this indictment in particular shows what I've said for a long time, is on Russia's part, probably a bad investment.

I think so much of their, certainly on social media, but a lot of their broader propaganda efforts seem to be pushing against open doors and capitalizing on extant political and socioeconomic themes. And trying to take credit for creating them or exacerbating them when there's endless evidence of their attempts to shape cognition or to shape public opinion.

But evidence of impact is much harder to come by. And it's a much harder scientific question to try to tackle. So with regard to this case, I think it's a positive thing and it's important that the U.S. is being very legalistic about this. This is not about content. This is about transparency about where your funding is coming from, about the oversight of this particular operation.

And I think that's a positive inflection point in how we talk about Russian interference.

CHAKRABARTI: Yeah. I take your point completely, that the strongest aspect of the DOJ's indictment might be the unregistered foreign agent aspect of it, which is an important part of protecting U.S. elections. I do, I have to say, memory is not serving me right now, Gavin, but it is interesting to me that here we have, at least according to the indictment, money flowing to first Canadian folks and then into the pockets of U.S. media influencers, rather than content farms and trolls from say, Moldova, from elections past. So is that different, that the money coming directly, eventually into U.S. pockets to do whatever operation?

WILDE: I think I would agree with what Will said earlier, that this kind of laundering through intermediaries is indeed a little bit more sophisticated than some of the slapdash social media trolling that we've seen in years past. But that said, I again think it's a lot of preaching to a choir of already converted. And in the case of Tenet Media, a deeply converted choir. So I don't know how much bang for the buck, so to speak, Moscow got out of orchestrating this.

CHAKRABARTI: Okay. So then in that case with Tenet as part of the picture here, when we look across the 2020, excuse me, the 2024 election, Gavin, again, to inject some healthy skepticism into this conversation. Is there unnecessary panic about Russian influence operations this time around? Is it the same as before, or are we seeing a series of meaningful changes that are worthy of closer inspection?

WILDE: I think it's a case of what you focus on expands. Since 2016, there's been certainly no shortage of evidence that Russia or Russian-aligned actors are trying to do this stuff in the information environment. And the more we look at it and expose it, we feel like we're probably countering it.

But if I put myself in Moscow's shoes or in one of these kind of chaos actors' shoes, all of the different media exposures or civil society exposures or tech sector reports about influence operations or certainly sanctions and indictments. Those are basically my marketing materials.

Those are the things I turn around and go to a political patron or a financier in Russia and say, look, what I'm doing is having impact, not necessarily because it's changing anything, but because it's getting attention. And so I do think that we have, we do run the risk of over hyping the threat because it tends to lend Russia a lot more credit than it's due. And it tends to exonerate us and our political leaders for the tone and tenor and policies here at home that really do impact the way people perceive democracy, or the way people perceive elections.

CHAKRABARTI: And that's an important point, right? Because I've been trying to be careful around my language of what these influence operations are, right?

Because back from 2016, I think there was a lot of confusion that Russia wasn't, the Russian influence operations, if I recall correctly, were never alleged to have actually swayed specific votes, but to change people's confidence, or how they think about American democracy and American elections.

Now, you've written a paper where you say that in and of itself, because democratic institutions actually really do rely on citizens having a common basis of information and reality in order to make rational decisions. That injection of further uncertainty is troubling in and of itself.

WILDE: Yeah, I would put it in terms of the way people develop beliefs and behaviors and attitudes is a very hard topic to study. And so it's going to be very hard to prove one way or the other that some kind of operation altered the course of history, or altered people's perceptions conclusively.

But I think what I argue in the paper is that insofar as we do worry that democratic institutions are under threat or that democratic trust is under threat, yes, we need to make sure that the American public does not lose faith in those institutions and political leaders. But the reverse is also true.

We need to make sure that we frame this in such a way that the political leaders and institutions of democracies don't lose faith in their citizenry by assuming that they are so easily duped that they cannot handle essentially being lied to on the internet or that they themselves don't have a role to play in selecting the type of media that they consume.

CHAKRABARTI: Okay, Gavin, hang on here for just a second. Because I want to bring another voice into this conversation. Nina Jankowicz joins us. She's co-founder and CEO of the American Sunlight Project, former head of the Department of Homeland Security's Disinformation Governance Board, and author of How to Be a Woman Online: Surviving Abuse and Harassment and How to Fight Back.

Nina, welcome back to On Point.

NINA JANKOWICZ: Always great to be with you, Meghna.

CHAKRABARTI: Okay. So what I'd like to do is use your expertise to give us a more detailed survey of what you believe we should be looking at insofar as evidence of Russian influences, influence operations, this time around in 2024.

What are some examples that you have?

JANKOWICZ: In addition to this DOJ indictment, which you've gone through in great detail. Of course, there's the doppelganger operation, which has been going on for a long time. And you mentioned this in the introduction to the show. This is fake websites that look like real news websites that have sometimes in this era, AI generated ... or kind of fake news headlines and articles that are meant to incite polarization and drive us further apart from one another.

We've seen evidence of bot networks that are still persisting on platforms like Twitter, despite Elon Musk's attempt to crack down on bots on that platform, allegedly. And in general, I would say that because of the landscape that we're in, where countering disinformation has become this political lightning rod, we have less information from the platforms about what's going on.

The government is a little bit hamstrung in its responses. And so I think the door is open, not just to Russia, but to other nations that may be trying to replicate the Russian playbook. And I want to couch this in saying, I don't see a Russian bear behind every content creator. Like Will said in the first interview, I would have never thought that these Tenet guys were being paid by Russia to create these videos until this indictment came out. But I do think that when you see emotionally driven and manipulative content online, it's important to ask yourself who's manipulating you and why. And whether they're coming from Russia, another country, or someone here in the United States as a content creator, if they're driving that sort of emotional content, I don't think they've got your best interests at heart.

CHAKRABARTI: Platform transparency. I'm putting a star next to that one in my notes, we're gonna come back to that in just a second.

But, give us your diagnosis though. Do you think we're more or less protected from influence operations, whatever they might be, than we were in 2016 or 2020.

JANKOWICZ: It's really hard to, I think, diagnose that. Because on the one hand, as Gavin said, we have these disclosures coming out from the U.S. government, despite the political circumstances, and I think that's a really good thing. I'm happy that they're happening. I think there's a high probability they go away if Trump is elected. But on the other hand, we've got an electorate that is more polarized than ever, where people aren't willing to see the gray area, the nuance. And people who are, I think, pretty gullible, especially on the fringes of the political spectrum.

Who, yeah, maybe their votes aren't getting changed, but in some cases, they're being radicalized to violence. And again, here I'm talking not just about Russia, but about broader disinformative content that is driving people to take action offline. So yeah, I'm not particularly optimistic about the state of our information environment as we head toward November.

CHAKRABARTI: Okay, so Gavin and Nina, let me play a little bit of sound from a Senate hearing that took place not long ago, regarding influence operations in the 2024 election. So this was before the Senate Intelligence Committee just last week, and one of the people that appeared before the Intelligence Committee was Kent Walker, President of Global Affairs at Google and its parent company, Alphabet.

And he told the committee that YouTube, which Google owns, is using AI tools to screen for bad information. And here's Walker talking about that with Maine Senator Angus King.

KENT WALKER: YouTube has gone from having one view in a hundred violating our policies to one view in a thousand. And that's in large part because we are using AI to detect some of these patterns of mis- and disinformation that are out there and take action against them.

ANGUS KING: You either can take action or you can alert your customers that this has been manipulated in some way.

WALKER: Agreed. And also provide high quality authoritative information. The old line “the best remedy for bad information is good information.” So the more we can promote accurate information about when the polls are going to be open, people's eligibility to vote, that's an important part of the democratic process.

CHAKRABARTI: So that's Kent Walker, President of Global Affairs at Alphabet, before the Senate Intelligence Committee last week. Okay, so I actually think that in that seemingly quotidian interaction, there's quite a lot going on.

And Gavin and Nina, I'd love your help in parsing it. So first of all, Walker there says bad videos have essentially gone from one view in a hundred, in terms of violating YouTube policies, to one view in a thousand. So an order of magnitude less. Hooray! But the policies part. Nina, do we have, has YouTube been more forthcoming in exactly what it takes for a video to violate policies when it comes to disinformation or misinformation?

JANKOWICZ: Yeah, that's exactly the key question, and why I was laughing before. Because YouTube actually walked back its policies about election denialism not long ago. It had stopped moderating content related to the 2020 election and claims that it was rigged. So while it might not be identifying, through its AI systems, content that it deems to be violative of its policies, its policies have gotten notably broader and more permissive in the last couple of years.

And I just want to say also, I'm really glad that Kent, Mr. Walker, was up on that panel, because Google has largely escaped scrutiny for a lot of the content that is being spread. So many people watch YouTube, use it as a source of news. And the fact that they're up there, hopefully getting grilled a little bit more harshly in the future, I think is good for transparency. But still, we have to recognize that there's been a step back in the types of policies that a democracy might want to see these companies taking ahead of a consequential election.

CHAKRABARTI: So Gavin, this leads me back to something that you wrote in this paper that appeared in the Texas National Security Review. Because in that paper you talk about why, given this very fraught information environment we're all living in, there is a very profound temptation for governments to essentially, in your words, intervene in the information environment, but that carries profound risks of backfiring.

OK, so given what Nina just said about YouTube walking its own policies about political misinformation forward and then backward, what can governments do, if anything?

WILDE: This is a thorny issue. Because, on one hand, you know, everyone's looking to regulators and certainly to the national security bureaucracy to try to do something about this perceived encroachment by foreign actors on what is our benchmark exercise of democracy.

On the other hand, certainly in democracies, you don't want to lend the government a lot of capacity to somehow arbitrate what the truth is or isn't, or wade into these issues that are certainly politically charged. And so there's a bit of a Catch-22. And certainly, I think, as amazing a job as I think the national security bureaucracy and the intelligence community are doing, they are very confined in what they can do in terms of legally, to cut down on what is essentially false speech.

But there's also a degree to which they're forbidden from even trying to ascertain the impact on the domestic population, because that's not their charter. And so they're stuck in this space where they have to, and rightfully do, detect this and warn us about it, but they're not really in much of a capacity and don't have many levers legally or institutionally to try to say, this influence operation worked.

This one didn't. And so it's very tough for governments to try to wade into.

CHAKRABARTI: Let me just ask you a quick question, about a minute before we have to take our next break. But Gavin, I am a profound supporter of the First Amendment. And I don't want to underestimate how important that is in this issue.

But I'm trying to think of a good analogy of when governments have felt that they must step in to protect the public, right? Okay, Dow Chemical, the government's not going to tell you what chemicals you can make, until those chemicals show that they have hurt people, right? Or they have to show that they won't hurt people.

In an information environment, when we're talking about the health of our democracy, which really can end up hurting people if it's unhealthy, shouldn't there be a threshold in which the government can step in?

WILDE: I think that the risk we run is that we become very much like Moscow itself, where we start viewing the public as if it's inert and dim witted. And we distrust the public and its ability to tolerate dissenting views, and even lies and misinformation. Because democracy is basically founded on that idea that we are resilient enough to withstand those things, and so I think that resilience part is the thing we have to key in on.

Part III

CHAKRABARTI: Nina, I wanted to hear your thoughts on what Gavin said a little earlier. Because I think it's at the heart of the bind, if I can put it that way, of what we can do in order to minimize the impact of disinformation or misinformation campaigns from foreign actors.

Now, Nina, you have been as clear as anyone in saying that, oftentimes, particularly with Russian influence operations, they're not saying vote for X person or oppose Y policy. As you said a few minutes ago, the goal is just to increase tension and polarization, specifically in Western democracies.

So given that, Gavin just said a minute or two ago that the government, the United States government acting too forcefully to reduce the impact of those influence campaigns risks making Washington a lot like Moscow itself. What's your response?

JANKOWICZ: I wouldn't go quite as far as Gavin did.

I think while I agree that we have to be very careful about the restrictions that we might place on this stuff, I actually think that there is a way to approach all of it, and inform the public and protect the First Amendment and people's right to seek out and respond to and amplify all sorts of information without restricting any sort of speech.

I think one of the things that we're missing now, and again, we're going to get back a little bit toward that platform transparency conversation, Meghna, is the fact that an indictment like this a couple of years ago would have been actually quite coordinated with the social platforms. We would have seen probably YouTube taking down that content much more quickly than they did.

Now they did take it down quite quickly, but they would have gotten a heads up from the DOJ. We might've seen the platforms releasing some data to anybody who wanted to get their hands on it and dig through it to draw some conclusions about what was going on with these platforms, the view counts, the engagement, et cetera.

But because of the political environment that I mentioned before, this equation of counter-disinformation work with censorship, nobody is doing anything like that anymore. And let me be clear, the type of regulatory environment that I am in favor of, that I have always been in favor of, is not making certain types of speech illegal, especially given that Russia is exploiting preexisting fissures in our society.

That would be really dangerous. What I think we do need is more transparency on the platforms. So that we understand what steps they are taking to respond to this content. Because they want to be the public square. They need to act like the public square, and they need to make sure that there's not garbage littering it.

They need to make sure that they're enforcing the policies that they already have on the books, and they do have the power to do that. These are private platforms that have terms of service for everybody that uses them. So that's what I think we need to move more toward. And ironically, the conversation in this country has shifted so far down the field that we can't even agree that more information about what's going on the platforms is a good thing.

That's really sad to me.

CHAKRABARTI: But Nina, the platforms themselves say, they come hat in hand to the Senate Intelligence Committee and they say, sirs, we are being transparent. What more do you want from us? Let's listen. This is Nick Clegg, interestingly, the former leader of the United Kingdom's Liberal Democrats.

But for the past several years, he has been President of Global Affairs for Meta, aka Facebook, and Clegg was at the Senate Intelligence Committee hearing last week, and here's how he talked about transparency on Meta's platforms, and he started with speaking about better user controls on social media.

NICK CLEGG: On our services, you can just turn the algorithm off. You can just have it chronologically delivered instead. You could click onto the three dots and you see exactly why you're seeing a post. You could say you don't want to see certain ads. You can prioritize certain content and not. I think user controls are crucial.

And secondly, we need to be transparent. We need to be transparent about what are the signals that we use in the algorithms. We publish alongside our financial results every 12 weeks, for instance, a full transparency report showing how we act on content that violates our policies. We have that audited by E.Y. So we're not sort of marking our own homework, if I can put it like that. User agency and sort of control, and a maximum amount of transparency from the companies, are the key ingredients here.

CHAKRABARTI: Okay, and by the way, I'm not going to ask forgiveness for my ocean of alligator tears. I cry for these platforms because they have too much money and too much power for us to not push them as hard as we can whether it's just not necessarily legally, but morally, to do as much as they can to help preserve the democracies that gave birth to them.

So here's a little bit more of Clegg talking about how Meta, according to him, is taking down misinformation and disinformation networks. And the fact that he says Meta shares that information publicly.

NICK CLEGG: Every time we find networks like that, we need to share that as widely as possible with researchers, with our colleagues in the tech industry, with government.

So for instance, we now publish every 12 weeks an adversarial threat report, and have done so in the last few years. And we blocked around 5,000 accounts and pages in a three-month period this year. We've placed a lot of the signals that we were able to detect on GitHub, so that everybody can look at that.

Everyone can learn from the experience, and we've got people [who] can then scrutinize it. Tell us what we've got right, what we've got wrong. I think that interchange of research and data is crucial to develop public and societal resilience in the long run.

CHAKRABARTI: Nick Clegg, President of Global Affairs for Meta, before the Senate Intelligence Committee last week.

Gavin, let me turn to you first here. He is saying that every 12 weeks they are publishing an adversarial threat report. So at least that's something. How would you gauge Meta's claims on transparency there?

WILDE: I think it's laudable and that work is great, but I think more broadly, we over-index on social media in this space.

And I think these kinds of discussions, and a lot of those publications, only serve to cement and legitimize these platforms' claim to be the source of our politics, to be the source of our attitudes and our behaviors. It legitimizes this idea that data about users somehow contains the secret to human persuasion, human cognition and social psychology.

One need only tinker with a chat bot or a large language model to see how misguided that entire philosophy is. And so I think both Russia and a lot of domestic commentators in the U.S., and the platforms themselves, are still operating as if the problems we're talking about are somehow rooted in and originate on social media and online.

And while I don't discount the fact that it plays a role, I would argue that our politics are rooted far more in real physical space and in real human interactions and human conditions, and that all of this discussion around social media is a distraction from that fact.

CHAKRABARTI: I will not argue against your point that our politics are rooted in real things and real experiences in life.

But you can't possibly be saying that the algorithmic amplification that we see on social media, where millions, tens of millions of people in this country, hundreds of millions, at least partially turn to social media for information, doesn't matter in shaping, at least in part, how people think about those core political identities that they have.

WILDE: I would certainly not disagree with you. I would say it matters in an emergent way but certainly not in a way that any one entity or set of trolls can somehow tap into and steer as simply as a lot of these influence operations purport to do.

CHAKRABARTI: Okay, so point taken. But you said we're over-indexing on social media, and not necessarily over-indexing on the influence of the trolls.

And I do think that's an important distinction, Gavin.

WILDE: Definitely fair. But again, your sentiments about your political viewpoints, I think, are probably more reinforced online than found or formed online in the first place. So I think, again, we should lend the consumer of information a little bit of the benefit of the doubt, that they're not just simply falling haphazardly down rabbit holes, but that they also have some agency, some accountability, and some decision making in what kinds of media they encounter online.

And that's deeper than I think the algorithm portrays it.

CHAKRABARTI: Point well taken. Absolutely. As individuals, we are not mindless consumers of information. We shouldn't be. And I'll come back to that in a few minutes, but Nina, I wanted to hear your response to that.

JANKOWICZ: Ah, I understand Gavin's point that the social media platforms are not the be all, end all of our political discourse right now.

But the fact is, and Meghna, you and I have talked about this at length on your program, I personally have been the victim of a campaign of lies that started online, moved to traditional media, was amplified there, and then amounted to threats against me and my family. And I am far from the only one that has happened to.

It started on the internet, right? It was seeded there and then exploded onto mainstream media platforms. I think this is a big force in our politics today. It is a big force in radicalization, and we can't discount the fact that Clegg and others are ignoring that they have restricted data access for researchers who are trying to shine light on the problems that we see in our societies right now.

And a lot of those problems are starting and being seeded on those platforms. Clegg neglected to mention that he shut down, not he specifically, but Facebook shut down CrowdTangle, which prior to this was the only way to get insight into what was going on in public areas on Facebook. They haven't really provided a suitable replacement for that.

So all of these adversarial threat reports, to use his term, they are grading their own homework, because we have no data to compare them against. This has been permitted essentially by the structure that Elon Musk created when he went from Twitter giving us troves of data to not giving any of that data, and monetizing the application programming interface that allowed researchers to access data on Twitter.

So everything that Musk does has made everybody else look like a golden boy. And I will note Twitter, X, did not even show up to that hearing at the Senate Intel Committee. So we're in the dark here. And I really do just want to note that there are many people who have had their lives affected in the way that mine has been affected by the internet.

And so while, again, I am the first person to lobby for more media literacy, I do think we have to put some emphasis on the fact that the platforms are permitting and creating structures which are radicalizing people.

CHAKRABARTI: And to the point regarding the value of the data they collect, it has to have some kind of value.

I was just looking it up just now. Meta's market cap is $1.4 trillion. Investors believe that they've got something that is extremely lucrative.

... Now we've only got a couple of minutes left, but Nina and Gavin, I do actually want to talk a little bit more about an area which might bring both of your perspectives closer together, because Gavin you had mentioned earlier about individual agency.

Absolutely being a critical part of keeping our collective democracy healthy. And Nina, you talked about media literacy, and I had some tape here ready to go, but of course I've run out of time to play it. But in the Senate Intelligence Committee hearing, several Baltic states were brought up repeatedly. It was said that for years they've been engaging in media literacy education from the youngest ages, that it's required from roughly kindergarten through fifth grade or even higher there, that citizens of the Baltic states are tasked with going out onto the internet and combating Russian misinformation, and that over time, perhaps that has actually made a difference.

So that's a combination of, let's say, policy and individual agency. And I'll let both of you give us your thoughts on that. But Gavin, is that a path that the U.S. should try to pursue a little bit more?

WILDE: I think there's certainly a quality there to try to model and mirror. The only asterisk I would put on it is that these are much smaller, much more culturally and socially homogenous societies than the United States, with populations that have a much different relationship to authority and the government than the U.S. certainly has. So I think finding a way to borrow those best practices or those models, while factoring in the very unique position the United States has found itself in, is probably a tough path to go down, but one that's worth trying.

CHAKRABARTI: Nina, I did hear your suppressed smile, laugh, grin about trying to advance media literacy, or the comparison to the Baltic states, but go ahead.

JANKOWICZ: Yeah, no, I think it's cute that the Senate has finally caught on to the fact that media literacy, information literacy more broadly is a good thing and that the Baltic States and others like Ukraine have been doing it for years.

I think we talked about that on your show four years ago, so I'm glad they're catching up. But I think this is something that the United States, with all its resources, all its ingenuity, absolutely can try to replicate. It's going to look different, as Gavin said, than in a small country like Estonia. We might have to do it at the state level, given state ownership of the education portfolio, or perhaps through civil society grant making as well.

But this is a generational investment that we need to make, and we need to start making it. And just to be clear, for those who might not be familiar with these concepts of media or information literacy, it's not about saying, X, Y, Z platform or newspaper good, this other one bad.

It's about giving people the tools they need to navigate today's information environment.

CHAKRABARTI: Critical thinking. Yeah.

JANKOWICZ: Yes, exactly. Things like, why do you get an ad for this particular vacation brand that you were looking at on your computer when you turn on your Instagram a couple of minutes later, right?

Understanding the ways you're being targeted, understanding that emotional manipulation I talked about at the top, that's all stuff that we could all do better with, and I would love to see some investment in the long term in all of that.

CHAKRABARTI: Yeah, because the fact of the matter is that this is reality now, right?

Information as a tool for cross-border influence campaigns is what's going to happen, with the U.S. not just on the receiving side, but on the sending side of that. So democracies have always relied on a healthy flow of information, and on the ability of people to process that information with a sense of shared reality.

So protecting all that, as you said, Nina is absolutely a project of generations.

This program aired on September 23, 2024.

Claire Donnelly is a producer at On Point.

Meghna Chakrabarti is the host of On Point.
