Social media platforms, algorithmic tools & editorial responsibility

On Tuesday, August 30th, 2016, it was reported that the German government had asked Facebook to remove hateful and illegal posts more quickly, as part of its corporate social responsibility. Social media companies, however, are typically reluctant to be very proactive in their approach to such removal, preferring to rely on notifications from users, because they do not want to be seen to edit the content that is shared, since this might lead to them being labelled a publisher. The moment a social media company becomes a publisher, it becomes subject to media regulations and open to libel laws. This was also the position that Zuckerberg reaffirmed one day earlier during a Q&A in Italy, where he said: “No, we’re a tech company, we’re not a media company,” adding that Facebook builds “the tools” and does “not produce any of the content.”

Even though it is true that social media platforms like Facebook do not have staff writers or commissioned freelancers producing content for them, the increasing use of algorithmic tools that select and recommend content to users is giving rise to a sense that the companies are engaging in ‘letter of the law’ versus ‘spirit of the law’ hair-splitting when they claim that they do not engage in editorial behaviour.

In the not-so-distant future it may well be possible to develop an automated system that can commission news content, edit the presentation of articles and publish them in an automated newspaper. Would we accept that the creators of such a system, the people who set the parameters of its behaviour, were merely tool makers and therefore did not have any editorial responsibility? Is it this kind of logic for avoiding responsibility that prompted Facebook to fire the human editors of its Trending feature and replace them with a less-than-perfect algorithm in the aftermath of complaints about possible political bias in the Trending Topics feed?

In July 2012 the Reuters Institute for the Study of Journalism published a report into the role of digital intermediaries such as search engines, social networks and app stores in enabling users to access news sources. Based on an analysis of the way in which these intermediaries are used and the ways in which they operate, the report argued that

“There are no exact parallels for the new digital intermediaries identified here – most are not neutral ‘pipes’ like ISPs, through which all internet content flows (although Twitter is close to this); nor are they pure media companies like broadcasters or newspapers, heavily involved in creative and editorial decisions. But they do perform important roles in selecting and channelling information, which implies a legitimate public interest in what they do.”

Facebook, for instance, controls the way in which its news feed is presented, and the priority it gives to news items, some of which may refer to news articles that friends have recently read. According to TechCrunch, Facebook “controls the news feed like an editor-in-chief controls a newspaper’s front page”. An important factor in this is the ‘walled garden’ architecture of social media platforms, which contrasts with the open internet accessed via search engines and gives the platform much greater control over the way in which users are presented with information.

The report finds that social networks, search engines and app stores act as gatekeepers that exert editorial-like judgements to varying degrees as they “sort and select content to provide news which is of ‘relevance’ to their customers, and decide which sources of news to feature prominently”, thereby affecting the nature and range of news content that users have access to. To safeguard the public against undue influence or manipulation, it recommends that regulation should pursue the overarching principles of “open access, transparency of policies, and clear accountability for any decisions taken. Intermediaries should be encouraged, as many do already, to publish the criteria used in making access decisions, including access to news.”

Approaches might include:

  • A requirement that digital intermediaries should guarantee that no news content or supplier will be blocked or refused access, unless for legal or other good reason, such reason to be explained with reference to publicly available criteria.
  • A requirement that digital intermediaries should carry or link to in a prominent position a range of news content deemed to be in the public interest (for example, a search engine could be asked to list at least x different news sources on the first page of a search, app stores could be asked to provide appropriate prominence to public-interest news over a period of time).
  • The establishment of an independent review body which could audit access practices and take complaints.

What of the argument that intermediaries like the social media platforms are merely creators of tools? The tools sort and recommend content, which may seem like editorial judgement, but it is done by objective algorithms, not humans. The algorithms merely select and sort results in a way that is aimed at giving users helpful and useful information. The idea that algorithms are ‘neutral’ because logical decisions are inherently free from prejudice is, unfortunately, deeply flawed: rational choices can only be made when the relative cost/benefit values of anticipated outcomes can be weighed against each other. These values must be pre-assigned, and as such are a form of prejudice. When creating an algorithm, the decision rules and outcome values are either assigned by the human designers or derived from data that was selected by the designers. As such, the responsibility for editorial decisions made by an algorithmic tool devolves back to its creators and to those in charge of setting the value parameters of the system, i.e. those in charge of the social media platforms.
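The point that an algorithm’s ‘objectivity’ rests on value judgements made upstream can be illustrated with a minimal sketch of a toy feed-ranking function. Everything here is hypothetical — the feature names and weights are invented for illustration and bear no relation to any platform’s actual criteria — but it shows where the human prejudgement lives: in the weights, not in the sorting.

```python
# A toy news-feed ranker. The WEIGHTS dictionary encodes value judgements
# assigned by a human designer; the algorithm merely applies them mechanically.
# All feature names and numbers are hypothetical, chosen for illustration only.

WEIGHTS = {
    "friend_engagement": 0.5,   # how much friends interacted with the item
    "recency": 0.3,             # how new the item is
    "source_reputation": 0.2,   # a score the platform assigns to the source
}

def rank_score(item: dict) -> float:
    """Weighted sum of pre-assigned feature values.

    The arithmetic is 'neutral', but the relative importance of each
    feature was decided by whoever set the weights.
    """
    return sum(WEIGHTS[feature] * item.get(feature, 0.0) for feature in WEIGHTS)

def rank_feed(items: list[dict]) -> list[dict]:
    # Sorting is deterministic, yet which items surface first depends
    # entirely on the designer-chosen weights above.
    return sorted(items, key=rank_score, reverse=True)

feed = [
    {"title": "A", "friend_engagement": 0.9, "recency": 0.2, "source_reputation": 0.5},
    {"title": "B", "friend_engagement": 0.1, "recency": 0.9, "source_reputation": 0.9},
]
print([item["title"] for item in rank_feed(feed)])  # → ['A', 'B']
```

Changing a single weight — say, raising `recency` above `friend_engagement` — would surface a different set of stories to every user, which is precisely the kind of decision an editor makes when choosing a front page.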

Napoli, P. M. (2015). Social media and the public interest: Governance of news platforms in the realm of individual and algorithmic gatekeepers. Telecommunications Policy, 39(9), 751–760.

Social media platforms make day-to-day decisions about what content is allowed on their platforms and the conditions under which this content should be removed. Intermediaries respond to incidents of cyberbullying and online harassment. They intervene in public controversies by censoring speech and terminating user accounts.

Social media platforms and the private companies that run them are potent because they have become vital components of the digital public sphere. How they design their platforms, how they allow content to flow, and how they agree to exchange information with competing platforms have direct implications for both communication rights and innovation. One objective of free expression is to create a communicative context necessary for the advancement of democracy. As the responsibility for the preservation of free speech shifts from public to private contexts, this will have implications not only for what counts as free expression online but also for democracy itself.
