How One Organization Is Curbing The Spread Of Disinformation In Black, Brown Communities

Andre Banks, co-founder of Win Black/ Pa'Lante, explains his organization's efforts to stop disinformation from spreading in Black and Brown communities. (Getty Images)

During the 2016 presidential election, Russian operatives spread disinformation that specifically inflamed racial tensions and discouraged Black voter turnout.

Four years later, these campaigns are back.

Facebook and Twitter have taken down a network of Russian actors who wrote articles spreading false stories about the Democratic presidential campaign of Joe Biden and Kamala Harris in order to undermine their support.

Social platforms are working to take these Russian-backed posts down, but some like Andre Banks, co-founder of Win Black/Pa'lante, say it's not enough. His network fights against disinformation that targets Black and Brown communities in the U.S.

Win Black/Pa’lante has been described as a “war room” that is pushing back against disinformation online. Every day, Banks’ team looks through their social feeds to find out what’s happening in the news — then they dive deep into the corners of the internet to identify where misinformation starts and spreads.

With strategists, graphic designers and video content creators at the table, Win Black/Pa’lante brainstorms how to combat misinformation using its own tactics. They craft shareable material, in the hope that it goes viral, to spread accurate information on issues such as voting.

“Unfortunately, it's a real fight — but we have to fight on the same terms,” Banks says. “That means we've got to make content that moves through feeds, that catches people's attention and really make sure that good information can compete with all of the bad information that's out there.”

One type of activity his team often detects is automated bots that impersonate people in order to drive conversations, he says. These bots will persistently bait people into arguments or back-and-forth discussions in order to “drive up attention on a particular piece of content,” he says.

Other disinformation campaigns often involve creating content with flat-out wrong information, he says, such as promoting an incorrect date for Election Day or telling people they can text in their vote.

“These are all things that are really meant to mislead people and make it more difficult for them to exercise their vote,” he says.

The 2020 presidential election is fast approaching on the heels of massive protests against police violence and anti-Black racism across the country. And similar to what happened with Russian interference in 2016, Banks says the current campaigns to misinform Black and Latino people aren’t a coincidence.

“It's not an accident that as our political power grows in the country, as our numbers grow in the country, also these kinds of attacks focused on our communities also grow,” he says. “These attacks make it harder for us to actually express our political power in the voting booth.”

His team has noticed many forms of disinformation and “outright lies” surfacing around the recent protests, he says. Hundreds of thousands of social media posts have appeared in the mainstream about violence at peaceful demonstrations, he says, oftentimes misrepresenting the purpose of the protest.

Countries such as Russia, Iran and China are purposefully stoking racial tensions — specifically preying on Black and Brown communities in the U.S. — because these voters have the power to swing elections, Banks says.

“We sometimes undervalue our own vote,” he says. “And these other people — people sitting across an ocean — actually realize that we, our individual votes, can really make a difference about who's in the White House.”

That was made clear when Hillary Clinton ran against Donald Trump. Russian operatives were intervening on behalf of Trump for a number of reasons, and “as a result, Black Americans ended up being the No. 1 target of misinformation in 2016,” he says.

But misinformation campaigns don’t spare white Americans. They often use similar divisive tropes and encourage conspiracy theories against Democrats — which is ultimately a threat to all communities, Banks says.

Facebook and Twitter are at the forefront of the conversation around misinformation and the upcoming national election. This year, Facebook deleted a page that used an image of LeBron James to spread false information about mail-in voting. While there is a clear attempt by social media platforms to stay ahead of this, Banks says companies need to do a lot more.

“I think what we've seen is like more talking about the problem than activating a solution,” he says.

Social networking companies need to be more proactive about identifying bad actors and eliminating them as soon as possible, he advises. For example, his team found an obvious bot that was generating thousands of comments within hours on a post — and yet, it took weeks of work for the account to be removed.

“I think these platforms have really an outsized effect on our political conversation. They have an outsized impact on how our political discourse happens,” he says. “And so they need to put an outsized number of resources toward making sure they're addressing this really, really critical problem that's often even more insidious than may be obvious on the surface.”

Education and news literacy are key to preventing future misinformation campaigns from escalating, Banks says. This starts with policing our own feeds.

To determine whether a piece of content is legitimate or not, Banks says to first look at the source. Do some clicking through the website and ask yourself whether it seems real, professional and credible.

Sharing misinformation isn’t just the work of bots and foreign agents. People are moving at such a rapid pace that sometimes we share information that doesn’t reflect the full truth, he says. Before hitting the share button, reflect on what you’re putting forward to your network.

Lastly, pick and choose your battles online, Banks says. Bots lure people into fights in order to work the social platform’s algorithm to increase visibility.

While these online arguments usually go nowhere, they do “actually have the impact of spreading misinformation,” he says. “Before you get into that fight, make sure that it's somebody that's really trying to argue with you around something that's actually real and matters.”

Cristina Kim produced and edited this interview for broadcast with Tinku Ray. Serena McMahon adapted it for the web.

This segment aired on September 2, 2020.



Tonya Mosley Correspondent, Here & Now
Tonya Mosley was the LA-based co-host of Here & Now.



Serena McMahon Digital Producer
Serena McMahon was a digital producer for Here & Now.

