On Tuesday August 30th (2016), it was reported that the German government had asked Facebook to remove hateful and illegal posts more quickly, as part of its corporate social responsibility. Social media companies, however, are typically reluctant to be proactive in such removal, preferring to rely on notifications from users, because they do not want to be seen to edit the content that is shared, since this might lead to their being labelled a publisher. The moment a social media company becomes a publisher, it becomes liable under media regulations and open to libel laws. This was also the position that Zuckerberg reaffirmed a day earlier during a Q&A in Italy, where he said: “No, we’re a tech company, we’re not a media company”; Facebook builds “the tools”, it does “not produce any of the content.”
On June 2nd and 3rd the biennial Danish conference on STS (DASTS16) was held in Aarhus, Denmark. The tagline for the conference was the “quintessential anti-determinist and anti-essentialist mantra of STS: ‘It could have been different’”.
The United Nations-mandated University for Peace (UPEACE) recently launched a call for papers to contribute to an e-book “that examines how the uses of current technologies, and the development of new ones, can contribute to guarantee and protect human rights within the 2030 Agenda for Sustainable Development framework. Hence, considering that every Sustainable Development Goal aims to protect one or more human rights, and that states will rely on the use of technological innovations and global interconnectedness to implement the 2030 Agenda, we are looking for articles that explore one or more of the following topics:
Over the last couple of weeks, Facebook has repeatedly had to defend itself against criticism of the way in which it makes editorial decisions when selecting the stories that appear in its ‘Trending Topics’ and ‘News Feed’.
The Science and Technology Committee is currently undertaking an inquiry into robotics and artificial intelligence (the deadline for written submissions was April 29th) as part of the continuing national strategy for Robotics and Autonomous Systems (RAS) innovation. In 2012 the UK Government identified RAS as one of the ‘Eight Great Technologies’, leading to the establishment of a ‘RAS Special Interest Group’ and the RAS national strategy in 2014. In 2015 the Special Interest Group published The UK Landscape for Robotics and Autonomous Systems, and the Engineering and Physical Sciences Research Council also launched a UK-RAS network.
What follows is the response that was submitted by Ansgar Koene and Yohko Hatada.
This week saw the publication of the report on ‘Online platforms and the Digital Single Market’ by the House of Lords EU Internal Market Sub-Committee. This report presents the findings of the inquiry that ran from October 2015 until spring 2016, which received 85 written responses and held 20 oral evidence sessions. Included in the written responses were two from Horizon Digital Economy Research, one by Prof. Rodden and one by myself, which we partially posted about on this blog in October 2015. The main driver for this inquiry was the publication in May 2015 by the European Commission (EC) of its ‘Digital Single Market Strategy for Europe’ (DSM), which drew attention to the growing role of online platforms as key players in social and economic interactions on the internet, and was followed on 24 September by the launch of an EC consultation, ‘A fit for purpose regulatory environment for platforms and intermediaries’. For the purposes of both the EC consultation and the Lords’ inquiry, online platforms were considered to ‘represent a broad category of digital businesses that provide a meeting place for two or more different groups of users over the Internet, examples of which include search engines, online marketplaces, the collaborative or sharing economy, and social networks’. What follows is an incomplete summary of the findings in the report, with a focus on the issues most closely related to our work at CaSMa: the fundamental rights of platform users (e.g. privacy), the role of algorithms, and user consent.
The European Commission Directorate General for Communications Networks, Content & Technology, a.k.a. DG Connect, recently launched a survey (deadline April 10th 2016) on the ‘future of the internet’ as part of its Net Futures agenda, which was established to “pioneer and coordinate research, innovation, and policy initiatives on what lies beyond the current internet architecture, software and services.” Below is a copy of my submission to the survey.
On February 17th and 18th the Alan Turing Institute held a two-day ‘scientific scoping workshop’ on the Algorithm Society, with the tagline: “If data is the oil of the 21st century then algorithms are the engines that animate modern economies and societies by providing reflection, analysis and action on our activities. This workshop will look at how algorithms embed in and transform economies and societies and how social and economic forces shape the creation of algorithms.”
The workshop started with three talks, covering FinTech (by Prof. Donald MacKenzie), human attitudes and expectations towards, and willingness to use and trust, algorithmic decisions (by Berkeley Dietvorst), and a proposal for a “Machine Intelligence Commission” to investigate and interrogate algorithmic bias and compliance with regulations (by Geoff Mulgan).