Web: The Bias of Facebook's Algorithms

May 2016 | From The Web

Readers of this column can be assured that it is not written, at least for now, by a highly capable computer algorithm, but by an (at least somewhat capable) human being. This is part of the unspoken deal between author and reader, person to person: you are free to agree or disagree, to find fault or accuse the writer of bias.

But what happens when the rules of engagement are less clear? What about when editorial decisions are not made by humans, but by computers? Or when they are in fact made by humans, but appear not to be? This question has been at the centre of the storm that engulfed Facebook – a property with as much power as any publisher – last week.

Facebook, according to an anonymous individual who worked on a section of the website called “Trending Topics” – a box in the top right corner of the website featuring a smattering of news events supposedly reflecting what Facebook users were discussing – routinely suppressed stories from news organisations with conservative leanings.

What’s more, several people who had worked in the section said they were asked to artificially “inject” stories into Trending Topics, even if the powerful algorithm that scans Facebook’s billions of conversations to take its collective heaving pulse did not propose them.

“In other words,” wrote Gizmodo, the technology website that broke the story, “Facebook’s news section operates like a traditional newsroom, reflecting the biases of its workers and the institutional imperatives of the corporation.”


Continue Reading: Telegraph