This month’s kerfuffle about how Facebook uses a mix of algorithms and human beings to determine what “Trending Topics” appear in the upper right corner of everyone’s Facebook home pages raised a fascinating set of questions. Is there a systematic liberal bias in the choices made by the content “curators” — a small group of about a dozen journalists who choose which stories make it onto the trending bar? Are the working conditions of these people emblematic of the slow, painful death of journalism? And if Mark Zuckerberg’s vision really is actualized — a vision in which readers never have to leave Facebook to get their news, opinions and entertainment (to the extent that a distinction still exists between them) — will that signal the death of commercial publishing for all but a handful of major players?
Based on a quick sampling of a range of sources — from Glenn Beck’s surprising and weird embrace of Zuckerberg’s good intentions and denunciation of his fellow conservatives for acting just like liberals, to Emily Bell’s thoughtful and comprehensive analysis of how Facebook is “swallowing” journalism — the answers to these questions seem to be no, yes and maybe.
According to interviews that Gizmodo conducted with some former contractors, the bias at work was more commercial than political. Curators were asked to write neutral headlines, select articles that seemed to be factual and credible based on their appearance in multiple “establishment” news outlets like The New York Times, Time, Fox News, CNN and Buzzfeed, and avoid clearly partisan or even demographically targeted sites on both the left and the right. More explicit were the requirements to show fealty to Facebook’s business interests by promoting only videos that had been uploaded to the site and avoiding any mention of Twitter by name in headlines and summaries.
Nonetheless, in an effort to correct or appease (depending on your perspective), Facebook on Monday announced that it will further limit the role of its human curators, retrain employees and impose new “controls and oversight” to try to cut down the chances for bias. The company will also stop consulting the list of 10 media outlets it had used to confirm the reliability and importance of a given story.
As for whether Facebook is exploiting or mistreating the dozen or so contractors who “curate” content, if benefits and working conditions are the measure, then no, not really. But I think the more informative question is not how they are treated, but what they are hired to do. Let me illustrate. The second top trending topic on my own Facebook page at this moment is that Chris Brown, a musician I’m too old and out of it to have heard of, has criticized Nia Guzman, another celebrity unknown to me, for photos of her daughter Royalty’s dress. (That headline appears just below one about the reported deaths of two Mount Everest climbers, and just above a story about a Los Angeles DJ, also unknown to me, being injured in a car crash.)
Whether this motley collection of clickbait is attributable to today’s crew of human curators or the algorithm they’re training, it’s evident that these journalists are not using their reporting or even writing skills. Rather, they are cherry-picking topics and sites for “consumable content” that will ultimately drive page views and ad revenue, with a little going to the original news sources, but undoubtedly more going to Facebook itself. As my journalist husband, Mark Schlack, observed in a lively email dialogue with our digital native daughter, Layla Schlack, also a journalist,
“When you scratch the surface, you realize that what’s really happening is that they’re taking real journalists out of the wild, so to speak, and isolating them in a room. Within a year, not being involved in actual reporting or research about a beat, they’ll be no more qualified than a knowledgeable reader to pick what’s relevant.”
After all, as Layla observes, there’s a distinction between news and content, and the blurring between the two isn’t a mere semantic problem. It’s a dangerous devaluation of real research and reporting. “Facebook is in the content business, which is pretty much antithetical to the news business,” she writes.
“The thing that makes news news is editorial judgment. Content is driven by what readers/consumers want. One is reporters and editors saying ‘We’re the experts, and we’re telling you this is important,’ while the other consists of outlets of various stripes saying ‘Our data show that this is what you, the non-expert consumer, like, so here you go!’ … If there’s one thing Facebook is good at, it’s harvesting data on the wants and habits of their users, and serving that up to advertisers. This is just a new way for them to monetize that. It’s an Ouroboros.”
Layla’s metaphor of a serpent eating its own tail is an apt one, and it highlights the third and most durable concern surfaced by Facebook’s drive to become people’s primary source of news. Back in the early days, the Internet was rightly celebrated as a place that countered the dominance of a few mainstream products and markets by allowing the “long tail” of niche content to thrive. But by having the enormous, diversified online collection of news, feature and opinion sites filtered into a handful of links on a Facebook page that readers will be increasingly unmotivated to leave, the promise of the World Wide Web risks being turned into its opposite.
“The Internet was supposed to be about disintermediating the gatekeepers,” Mark notes. “Google is the biggest gatekeeper in history and now Facebook wants to join them.”
As readers, we’ll no doubt take advantage of those trending topic links (if and when they’re a tad more illuminating than those in my feed today). But depend solely on them to inform our worldview? That would be a step backwards for press and citizenry alike.