Facebook is denying that it censors the news and trends shown in the site’s Trending Topics section. The denial comes on the heels of a Gizmodo story reporting that Facebook’s human curators suppress conservative news stories from the Trending Topics section. The report also claimed that those curators were encouraged to “artificially inject” stories into the trending section even if they weren’t popular enough to merit inclusion.
Facebook executive Tom Stocky posted a lengthy rebuttal to the Gizmodo report, claiming that none of those things is true. Stocky said that topics are surfaced via an algorithm, and that human editors sift through those topics to ensure their relevance and “disregard junk or duplicate topics, hoaxes or subjects with insufficient sources,” but are not asked to add their own stories.
Conservatives were up in arms about the social network’s role in promoting certain stories. Facebook has long argued that it is a neutral platform, and that the almighty algorithm decides what content is suggested and appears in people’s feeds. Gizmodo’s report certainly turned that idea on its head. Facebook drives a lot of traffic for publishers, but more importantly, its algorithm determines what information its 1.6 billion monthly users see. A biased algorithm, or biased curators, could potentially push a specific political or social agenda. Earlier in the evening, Republican National Committee Chairman Reince Priebus started a petition demanding that Facebook respond to the story.
The company’s moderation team is notorious for its heavy-handed approach to topics like nudity, even as governments worldwide criticize it for failing to remove and report content glorifying terrorism quickly enough. But being a community moderator at Facebook is a thankless task. The work, often outsourced to companies like Manila-based contractor TaskUs, is performed with little remuneration or training. And even the best-paid, highly skilled employees would struggle to derive a consistent plan of action from Facebook’s vague attempts at drawing up community standards.
A 2015 study suggested that more than 60% of Facebook users are entirely unaware of any algorithmic curation on the platform: “They believed every single story from their friends and followed pages appeared in their news feed,” the authors wrote. These decisions don’t feel outrageous, because Facebook sells them under a veneer of neutrality.
Articles with a longer read time aren’t shown because Facebook made an editorial decision that you shouldn’t read short pieces; instead, it’s because “the time people choose to spend reading or watching content they clicked on from news feed is an important signal that the story was interesting to them.” And so Facebook promotes stories with a high read time, because it wants the news feed to be full of “interesting” stories.