    What Facebook can learn from video games and press ethics

    by Per Strömbäck | 25/May/2016 | 7 min read

    Facebook newsroom should take a page from the accumulated experience in hundreds of years of press ethics, and a couple of decades of video games. Its first move should be to be transparent about its news algorithm and its priorities.

    The tech community loves to make up laws to describe certain phenomena, such as Moore’s law, which predicts the growth of computing power, and the perhaps more humorous Godwin’s law, which says that any online discussion that goes on long enough will eventually end with someone comparing someone else to Hitler.

    But for understanding the digital world, the most important of these laws is probably Metcalfe’s law.

    It says that the value of a network increases with the square of the number of members (or nodes), which by extension means that the price of staying outside the network increases with every new member.
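    A minimal Python sketch of that relationship shows how quickly the cost of opting out grows; the member counts and the per-link value constant below are illustrative assumptions, not figures from the article.

    ```python
    # Illustrative sketch of Metcalfe's law: a network's value grows roughly
    # with the square of its member count, so the value a newcomer forgoes by
    # staying outside keeps rising as the network grows.

    def network_value(members: int, value_per_link: float = 1.0) -> float:
        """Approximate a network's value as proportional to members squared."""
        return value_per_link * members ** 2

    # The last figure echoes Facebook's roughly 1.6 billion users.
    for members in (1_000, 1_000_000, 1_600_000_000):
        marginal = network_value(members + 1) - network_value(members)
        print(f"{members:>13,} members: total value ~{network_value(members):.2e}, "
              f"value of joining (i.e. cost of staying out) ~{marginal:.2e}")
    ```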

    This can be good news: for auctions or advert listings it is convenient to have everything in one place. The downside, of course, is that it spawns very powerful dominant players in each niche (cue Vestager vs Google).

    No business knows better how to game Metcalfe’s law than Facebook. With some 1.6 billion users, the point where anyone could realistically opt out was passed long ago.

    Far from the naïve place of birthday greetings and flirty pokes it may once have been, Facebook today is more like an operating system for interaction far beyond social life: marketers use it to build hype and businesses to interact with customers, but dictators also use it to spread propaganda and terrorist organisations to distribute beheading videos.

    It cannot be easy to be Mark Zuckerberg: one day your service is believed to bring democracy to the Middle East through its sheer existence, the next you have to travel to Germany to make apologies for Nazi hate speech.

    If you’re a global service, you face the problem of different rules in different jurisdictions. So far, Silicon Valley has successfully played the “safe harbour” card, saying they can’t control what users post. (If all else fails, play the algorithm card – as in “we don’t really know what it does”!).

    This is not really saying “we take no responsibility” but rather a way to make their own rules. That is convenient for the businesses; the problem is that other people may disagree. And the deeper you get involved in a culture, the more difficult it becomes to surf above the clouds.

    These trends come together as Facebook’s power over the news becomes more evident.

    Depending on what Facebook decides to show in its users’ feeds, it wields a lot of influence over the digital public sphere. The current debate about Facebook’s alleged anti-conservative bias hints at a much bigger issue.

    When we ask “how can we know if the gravitation toward anti-Trump stories is a result of user preference or algorithm settings?”, we’re really asking questions such as: What rules and principles apply to Facebook’s news feed algorithm? Who is the editor in charge? Does Facebook subscribe to normal press ethics such as verifying stories with more than one source and hearing both sides of an issue?

    These are basic things taught at every journalism school, developed over decades, even centuries, of free press. Systems of self-regulatory ethics bodies continuously evaluate and refine these lessons, weighing which publishing decisions deserve criticism and which do not.

    The details of the formal systems may vary from country to country, but the principles are the same and around them is a living conversation in the professional journalist community about when to publish a story and when not to, balancing the interests of privacy (for example of crime victims) and the public’s right to information.

    It is tempting to conclude that internet users should simply be better advised: do not share hoax stories, be sceptical of sources. But that is the easy way out.

    If journalists with years of education and the ethics of the professional community to draw from find these decisions difficult enough to deserve seminars, ethics committees, even specialist magazines and radio shows, how could we ever expect the average social media user to take such a responsibility?

    The answer will always be that the organisation that delivers the news is responsible for the content. Mass distribution with no editorial responsibility is a recipe for disaster.

    In 2012 in Gothenburg, Sweden, teenagers’ use of social media for sexual bullying and hate speech spiralled out of control and led to beatings and even street fights in what became known as the “Instagram riots”.

    When The Pirate Bay posted autopsy photographs from a court case involving two children who had been murdered with a hammer, much to the horror of the Swedish public and not least the victims’ family, its spokesperson claimed the photographs were on public record and therefore could be distributed without limitation.

    With normal press ethics, neither of these events would have happened. Editors would have stopped them.

    When Wikileaks released diplomatic cables and military files, it exposed horrible abuse but also made public the names of local Western sympathisers, putting them at risk of vengeance from insurgents.

    Edward Snowden learned from this and wisely released his leaks through established news outlets. The recent Panama papers leak is an even better example of responsible journalism, where hundreds of journalists worked together on the material before anything was made public.

    But how can a service like Facebook use any of this?

    It’s their users who post and share the material after all, not Facebook itself. The algorithm aside, Facebook could also learn from video games.

    That’s right: many games offer discussion forums, user-generated content and in-game chat channels. Games companies try to keep a good atmosphere and to avoid hate speech and sexism, but as a game becomes popular it quickly becomes impossible for the company to monitor all the content and understand every language.

    The normal functions, such as reporting abuse and blocking users, are often not enough and can themselves be abused. Instead, many game companies grant selected users moderator privileges, delegating editorial responsibility to trusted players. (In fact, this is the same model Google applies to its trouble-shooting forums, where users help other users.)

    The beauty is that this model scales with the community itself, even to billions of users. Facebook probably cannot simply copy it, but it can apply the idea to its newsroom service.
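    To make the idea concrete, here is a rough Python sketch of such a delegated-moderation model; every name in it (User, Post, report, moderate) is a hypothetical illustration, not anything Facebook or any game company actually ships.

    ```python
    # Hypothetical sketch of delegated moderation: ordinary users can only flag
    # content, while trusted community members are granted moderator rights,
    # so review capacity grows with the community rather than with paid staff.
    # All names and fields here are illustrative assumptions.

    from dataclasses import dataclass, field

    @dataclass
    class User:
        name: str
        language: str = "en"
        is_trusted_moderator: bool = False

    @dataclass
    class Post:
        author: User
        text: str
        flagged_by: list = field(default_factory=list)
        hidden: bool = False

    def report(post: Post, reporter: User) -> None:
        """Any user may flag a post; flags alone never remove content."""
        post.flagged_by.append(reporter.name)

    def moderate(post: Post, moderator: User) -> bool:
        """Only a trusted moderator who shares the author's language may hide a flagged post."""
        if moderator.is_trusted_moderator and moderator.language == post.author.language and post.flagged_by:
            post.hidden = True
            return True
        return False

    # Example: an ordinary user flags a post, a trusted player then hides it.
    author = User("alice")
    mod = User("carol", is_trusted_moderator=True)
    post = Post(author=author, text="some rule-breaking content")
    report(post, User("bob"))
    assert moderate(post, mod) and post.hidden
    ```

    The point of the design is that flagging is open to everyone while removal requires a trusted member who understands the language in question, so the number of people able to act grows with the community itself.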

    In traditional media, pluralism is perhaps the most important vaccine against media bias. With plenty of different publications, there is always another view available. It is no coincidence that the Soviet regime preferred to have only one news publication: Pravda (“The Truth” in Russian).

    With the mechanics of Metcalfe’s law, pluralism online becomes a challenge.

    As Facebook benefits particularly from that phenomenon, it has an even greater responsibility to uphold pluralism on its platform. It could start by looking at what has worked for the press and for video games.

    But its first move should be to be transparent about its news algorithm and its priorities. After all, Facebook asks for complete transparency from its users.

     

    Picture credits: forzadagro

    Gamergate: What can we all learn from the controversy that rocked the gaming world

    by Per Strömbäck | 09/Sep/2015 | 5 min read

    It is not only about a “movement” of sexist nerds. The Gamergate controversy, which erupted one year ago, reveals a much wider “dark side” of Internet life.

     

    Last autumn, GamerGate shocked the games industry. While it may have masqueraded as an online debate on press ethics, the actual effect was to silence female journalists and academics who publicly criticized sexist depictions of women in games.

    Hundreds or thousands of anonymous web users made rape and death threats toward the handful of public women who were the targets and victims of GamerGate.

    In some cases, GamerGaters allegedly also paid visits in real life. Media critic Anita Sarkeesian cancelled a speech at Utah State University after an email threatened that a mass shooting would take place if she gave it.

    Game developer Brianna Wu had to flee her home after her address was posted on Twitter (alongside rape and murder threats).

    This is not an isolated event: anonymous haters online – or trolls – use social media to silence the voices of those they happen to disagree with, ironically often citing freedom of speech as a justification. Sexism is just one theme; racism may be even more popular.

    GamerGate grew out of the online forum 4Chan, famously connected to the Anonymous movement – a loose collective of anarchist internet activists, some of whom may also have been involved in GamerGate.

    However, even the moderators of the notoriously permissive 4Chan decided that GamerGate went too far and kicked the campaign out. The GamerGaters regrouped at a similar but even more lax online space called 8Chan (or InfiniteChan), which has hardly any rules whatsoever.

    There, the actions against the likes of Sarkeesian and Wu were orchestrated; the actual attacks, however, were carried out mainly via Twitter using the #GamerGate hashtag.

    Anyone who says something like “sticks and stones may break my bones, but words can never hurt me” or “freedom of speech is absolute and can also be used to defend oneself against hate speech” has never been on the receiving end of something like Gamergate and has a very limited understanding of freedom of expression.

    It is fair to express one’s own views, but not to try to abuse others into silence. I have met many who prefer to remain silent even on much less controversial topics, such as piracy or vaccines, for fear of threats or hate speech.

    Anonymity has something to do with it, but the lack of consequences is a more important factor. Some of the cyberbullying directed against Sarkeesian, for example, was not anonymous: attackers bragged on forums about how they had vandalised her Wikipedia page or posted porn images with her head pasted onto them.

    The games industry was in shock. For many years, large parts of it had made great efforts to attract more women as players and employees, and to shed the age-old stamp of sexism.

    The GamerGaters claimed the right to define who gets to play games and, in particular, who gets to have opinions about them. It went against every ambition of gender equality and all the progress made in the last decade. And the game world reacted.

    Sweden’s top game developers wrote an op-ed saying “not in the name of our games”. Thousands signed petitions. The mainstream media covered the story with little patience for the haters who hid in anonymity.

    Companies and organisations launched equality and diversity initiatives. Processor manufacturer Intel set aside 300 million US dollars for equal-opportunity initiatives. Some of these activities were already under way; some were a consequence of GamerGate.

    But the most important actions may have been much humbler. Many game companies changed the rules on their forums, making the consequences clearer and having moderators enforce them more strictly.

    In an online world without consequence, it is only too easy to post before thinking, more often than not exaggerating to impress other users.

    The tone on many game forums may certainly have contributed to the Gamergate attitudes. But the game forums are also part of the solution.

    Other social media could learn from how active moderation and clear rules can form a climate where respect and freedom of speech prevail over hate and bullying. The game world learned it the hard way.

     

     photo credit: PJ Rey