Facebook And The First Amendment: Policing Free Speech On The Platform
Facebook CEO Mark Zuckerberg arrives to testify before a House Financial Services Committee hearing on Capitol Hill in Washington, Wednesday, Oct. 23, 2019, on Facebook's impact on the financial services and housing sectors. (Andrew Harnik/AP)

Mark Zuckerberg wraps Facebook in the First Amendment and says the company won't police free speech or political ads. But does Facebook's own behavior tell a different story?

Guests

Tony Romm, senior tech policy reporter at the Washington Post. (@TonyRomm)

Jennifer Grygiel, professor of communications at Syracuse University’s Newhouse School of Public Communications. (@jmgrygiel)

Roslyn Layton, visiting scholar at the American Enterprise Institute. (@RoslynLayton)

From The Reading List

Washington Post: "Facebook’s Zuckerberg takes broad lashing on Libra, 2020 election and civil rights at congressional hearing" — "Congressional lawmakers delivered a broad lashing of Facebook chief executive Mark Zuckerberg on Wednesday, sniping at his company’s plans to launch a digital currency, its pockmarked track record on privacy and diversity, and its struggles to prevent the spread of misinformation online.

"The wide-ranging criticisms came largely from Democrats during a hearing at the House Financial Services Committee, which convened the session to probe Facebook’s plan to launch a cryptocurrency, called Libra. Facebook’s efforts have catalyzed a rare alignment of opposition from the party’s members of Congress and some Trump administration officials, who are concerned Libra could trouble the global financial system.

"Quickly, though, the hearing expanded in focus, reflecting the simmering frustrations on Capitol Hill with practically the entirety of Facebook’s business. Rep. Maxine Waters, the panel’s chairwoman, cited the news that Facebook removed from its platform a number of efforts to spread disinformation, including a Russian campaign predominantly on Facebook-owned Instagram that targeted users in swing states such as Florida. She said it showed foreign malefactors are 'at it again,' four years after Russians took aim at the 2016 race."

Nieman Lab: "Should Facebook have a “quiet period” of no algorithm changes before a major election?" — "Facebook’s News Feed algorithm determines what users see on its platform — from funny memes to comments from friends. The company regularly updates this algorithm, which can dramatically change what information people consume.

"As the 2020 election approaches, there is much public concern that what was dubbed 'Russian meddling' in the 2016 presidential election could happen again. But what’s not getting enough attention is the role Facebook’s algorithm changes play, intentionally or not, in that kind of meddling.

"A key counterpoint to the Russian misinformation campaign was factual journalism from reputable sources — which reached many of their readers on Facebook and other social media platforms. As a social media researcher and educator, I see evidence that changes to Facebook’s News Feed algorithm suppressed users’ access to credible journalism in the run-up to Trump’s election.

"Political operatives know Facebook serves as a gatekeeper of the information diets of more than 200 million Americans and 2 billion users worldwide. Actions and abuse by others on the platforms have generated much concern and public discussion, including about how much disinformation and propaganda Americans saw before the election. What has not been talked about enough is the effect that Facebook’s algorithmic shifts have had on access to news and democracy."

The Atlantic: "Facebook Restricts Speech by Popular Demand" — "This past week, with some fanfare, Facebook announced its own version of the Supreme Court: a 40-member board that will make final decisions about user posts that Facebook has taken down. The announcement came after extended deliberations that have been described as Facebook’s 'constitutional convention.'

"We don’t usually use sweeping terms such as Supreme Court and constitution to describe the operation of private companies, but here they seem appropriate. Internet platforms such as YouTube and Facebook have been called the modern public square. That description understates the platforms’ importance for the many people who use them in place of newspapers, TV stations, the postal service, and even money. People whose posts are removed from major platforms say they are being excluded from the most important communication channels of our age. We should care deeply about the rules these companies apply to our speech and behavior—whether PayPal should process donations to WikiLeaks, for example, or whether the security provider Cloudflare should protect neo-Nazi sites or 8chan, or whether Facebook should have taken down the famous Vietnam War photo of a naked girl fleeing her village.

"But private platforms aren’t really the public square, and internet companies aren’t governments. That’s exactly why they are free to do what so many people seem to want: set aside the First Amendment’s speech rules in favor of new, more restrictive ones. Messages we might once have heard from a soapbox in the park—including very troubling ones about dangers of vaccines, conspiracy theories, or racist agendas—can be banished from social-media platforms."

This program aired on October 24, 2019.
