Posted on 25/May/2016

Facebook's newsroom should take a page from hundreds of years of accumulated press ethics, and from a couple of decades of video games. Its first move should be to become transparent about its news algorithm and its priorities.

The tech community loves to coin laws to describe certain phenomena, such as Moore's law, which predicts the growth of computing power, and the perhaps more humorous Godwin's law, which says that any online discussion, if it goes on long enough, will end with someone comparing someone else to Hitler.

But in order to understand the digital world, probably the most important of these laws would be Metcalfe’s law.

It says that the value of a network increases with the square of the number of members (or nodes), which by extension means that the price of staying outside the network increases with every new member.
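
A rough way to see where the square comes from is the simple pairwise-connection reading of the law. Here is a minimal sketch, with made-up membership numbers purely for illustration:

```python
# Toy illustration of Metcalfe's law: if every member can in principle
# connect to every other member, the number of possible connections
# grows with n*(n-1)/2, i.e. roughly with the square of the membership.

def possible_connections(members: int) -> int:
    """Distinct pairs among `members` nodes."""
    return members * (members - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} members -> {possible_connections(n):>12,} possible connections")
```

Ten times more members means roughly a hundred times more possible connections, which is exactly why leaving the network gets more expensive with every new member.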

This can be good news: for auctions or advert listings it is convenient to have everything in one place. The downside, of course, is that it spawns very powerful, dominant niche players (cue Vestager vs Google).

No business knows better how to game Metcalfe's law than Facebook. With some 1.6 billion users, the point at which anyone could realistically opt out was passed long ago.

Far from the naïve place of birthday greetings and flirty pokes it may once have been, Facebook today is more like an operating system for interaction far beyond social life: marketers use it to build hype and businesses to interact with customers, but dictators also use it to spread propaganda and terrorist organisations to distribute beheading videos.

It cannot be easy to be Mark Zuckerberg: one day your service is believed to bring democracy to the Middle East through its sheer existence, the next you have to travel to Germany to make apologies for Nazi hate speech.

If you’re a global service, you face the problem of different rules in different jurisdictions. So far, Silicon Valley has successfully played the “safe harbour” card, saying they can’t control what users post. (If all else fails, play the algorithm card – as in “we don’t really know what it does”!).

This is not really saying "we take no responsibility"; it is rather a way to make their own rules. Convenient for the businesses; the problem is that other people may disagree. And the deeper you get involved in a culture, the harder it becomes to surf above the clouds.

These trends come together as Facebook’s power over the news becomes more evident.

Depending on what Facebook decides to show in its users’ feeds, it wields a lot of influence over the digital public sphere. The current debate about Facebook’s alleged anti-conservative bias hints at a much bigger issue.

When we ask “how can we know if the gravitation toward anti-Trump stories is a result of user preference or algorithm settings?”, we’re really asking questions such as: What rules and principles apply to Facebook’s news feed algorithm? Who is the editor in charge? Does Facebook subscribe to normal press ethics such as verifying stories with more than one source and hearing both sides of an issue?

These are basic things taught at every journalism school, developed over decades, even centuries, of free press. Self-regulatory ethics bodies continuously evaluate and evolve these lessons, refining which publishing decisions are criticised and which are not.

The details of the formal systems may vary from country to country, but the principles are the same and around them is a living conversation in the professional journalist community about when to publish a story and when not to, balancing the interests of privacy (for example of crime victims) and the public’s right to information.

It is tempting to conclude that internet users should simply know better, stop sharing hoax stories and be sceptical of sources, but that is the easy way out.

If journalists with years of education and the ethics of the professional community to draw from find these decisions difficult enough to deserve seminars, ethics committees, even specialist magazines and radio shows, how could we ever expect the average social media user to take such a responsibility?

The answer will always be that the organisation that delivers the news is responsible for the content. Mass distribution with no editorial responsibility is a recipe for disaster.

In 2012 in Gothenburg, Sweden, teenagers’ use of social media for sexual bullying and hate speech spiralled out of control and led to beatings and even street fights in what became known as the “Instagram riots”.

When The Pirate Bay posted autopsy photographs from a court case involving two children who had been murdered with a hammer, much to the horror of the Swedish public and not least the victims’ family, its spokesperson claimed the photographs were on public record and therefore could be distributed without limitation.

With normal press ethics, neither of these events would have happened. Editors would have stopped them.

When Wikileaks released diplomatic cables and military files, it exposed horrible abuse but also made public the names of local Western sympathisers, putting them at risk of vengeance from insurgents.

Edward Snowden learned from this and wisely released his leaks through established news outlets. The recent Panama papers leak is an even better example of responsible journalism, where hundreds of journalists worked together on the material before anything was made public.

But how can a service like Facebook use any of this?

It’s their users who post and share the material after all, not Facebook itself. The algorithm aside, Facebook could also learn from video games.

That's right: many games offer discussion forums, user-generated content and in-game chat channels. Game companies try to keep a good atmosphere and to keep hate speech and sexism out, but as a game becomes popular it quickly becomes impossible for the company to monitor all the content and understand all the languages.

The normal functions, such as reporting abuse and blocking users, are often not enough and can themselves be abused. Instead, many game companies grant selected users moderator privileges, delegating editorial responsibility to trusted players. (In fact, this is the same model Google applies to its trouble-shooting forums, where users help other users.)
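
A minimal sketch of that delegation model, with all class names, roles and thresholds invented for illustration (real platforms are of course far more elaborate):

```python
# Toy sketch of delegated moderation: any user can flag content,
# but only users promoted to moderators can actually hide it.
# All names here are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    flags: set = field(default_factory=set)
    hidden: bool = False

class Community:
    def __init__(self) -> None:
        self.moderators: set[str] = set()

    def promote(self, user: str) -> None:
        """Grant moderator privileges to a trusted user."""
        self.moderators.add(user)

    def flag(self, user: str, post: Post) -> None:
        """Any user can report a post for review."""
        post.flags.add(user)

    def hide(self, user: str, post: Post) -> None:
        """Only moderators may remove content from view."""
        if user in self.moderators:
            post.hidden = True

community = Community()
community.promote("trusted_player_42")

post = Post(author="spammer", text="hateful message")
community.flag("ordinary_user", post)
community.hide("trusted_player_42", post)
print(post.hidden)  # True
```

The point is not the code itself but the division of labour: the platform sets the rules and picks whom to trust, while the day-to-day editorial judgement is spread across the community.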

The beauty is that it can scale almost without limit, even to billions of users. Facebook probably cannot simply copy that model, but it can use it for its newsroom service.

In traditional media, pluralism is perhaps the most important vaccine against media bias. With plenty of different publications available, there is always another view on offer. It is no coincidence that the Soviet regime preferred to have only one news publication: Pravda ("The Truth" in Russian).

With the mechanics of Metcalfe’s law, pluralism online becomes a challenge.

As Facebook benefits particularly from that phenomenon, it has an even greater responsibility to uphold pluralism on its platform. It could start by looking at what has worked for the press and for video games.

But its first move should be to become transparent about its news algorithm and its priorities. After all, Facebook expects complete transparency from its users.

 

Picture credits: forzadagro