The United Nations-mandated University for Peace (UPEACE) recently launched a call for papers to contribute to an e-book “that examines how the uses of current technologies, and the development of new ones, can contribute to guarantee and protect human rights within the 2030 Agenda for Sustainable Development framework. Hence, considering that every Sustainable Development Goal aims to protect one or more human rights, and that states will rely on the use of technological innovations and global interconnectedness to implement the 2030 Agenda, we are looking for articles that explore one or more of the following topics:”
At the heart of current online consumer protection is the concept of informed consent, whereby the prospective consumer makes a conscious decision to sign up to a service with full knowledge of, and consent to, the consequences of doing so. Even under the newly signed EU General Data Protection Regulation, which will go into effect in 2018, this will not fundamentally change. For anyone who has ever used a commercial internet service, however (and this includes policy makers), it is glaringly obvious that there is a fundamental flaw in this approach, namely the assumption that the consumer has a good understanding of the contract that is being entered into.
This week saw the publication of the report on ‘Online platforms and the Digital Single Market’ by the House of Lords EU Internal Market Sub-Committee. This report presents the findings of the inquiry that was held from October 2015 until spring 2016, drawing on 85 written responses and 20 oral evidence sessions. Included in the written responses were two from Horizon Digital Economy Research, one by Prof. Rodden and one by myself, which we posted about on this blog in October 2015. The main driver for this inquiry was the publication in May 2015 by the European Commission (EC) of its ‘Digital Single Market Strategy for Europe’ (DSM), which drew attention to the growing role of online platforms as key players in social and economic interactions on the internet, and was followed on 24 September by the launch of an EC consultation, ‘A fit for purpose regulatory environment for platforms and intermediaries’. For the purposes of both the EC consultation and the Lords’ inquiry, online platforms were considered to ‘represent a broad category of digital businesses that provide a meeting place for two or more different groups of users over the Internet, examples of which include search engines, online marketplaces, the collaborative or sharing economy, and social networks’. What follows is an incomplete summary of the findings in the report, with a focus on the issues most closely related to our work at CaSMa: the fundamental rights of platform users (e.g. privacy), the role of algorithms, and user consent.
The European Commission Directorate General for Communications Networks, Content & Technology, a.k.a. DG Connect, recently launched a survey (deadline 10 April 2016) on the ‘future of the internet’ as part of its Net Futures agenda, which was established to “pioneer and coordinate research, innovation, and policy initiatives on what lies beyond the current internet architecture, software and services.” Below is a copy of my submission to the survey.
While exploring the Internet And Human Rights Resources Center at the Internet Society, I encountered a highly informative report from the Global Commission on Internet Governance, published in January 2016, on the extent to which the management of individuals’ fundamental rights, e.g. privacy and free speech, is in the hands of corporations.
The report presents an excellent overview of how a small set of companies, by controlling the web platforms where most people spend the majority of their time online, have become major actors in determining the state of human rights online.
Apple is not willing to weaken its hard-to-crack encryption protocols after being asked by a magistrate judge to build a weaker new version of its mobile operating system. This request comes from the FBI as an attempt to access the content of the iPhone used by one of the gunmen involved in the San Bernardino shooting. It has been argued that law enforcement officials wanted to use this high-profile act of terrorism to create political pressure on Apple to comply with the FBI’s demands and set a dangerous precedent for future cases.
On 25th February 2016, the Digital4EU Stakeholder Forum, organized by the European Commission, took place in Brussels. This one-day conference centred on the progress made in creating a Digital Single Market in Europe. The day’s agenda started with a pre-conference breakfast session about the European Fund for Strategic Investment (EFSI)’s financing opportunities for digital projects, especially for extending the roll-out of broadband internet connections in rural areas that have not yet achieved full internet penetration.
According to the General Data Protection Regulation (GDPR), information society services that wish to process any personal information relating to a child under the age of 16 years will require parental/guardian consent. The GDPR is the European Commission’s tool that will unify data protection in the EU, and there are plans for it to be adopted in 2018. In the most recent GDPR draft released by the European Council, the age limit below which parental consent is mandatory has been raised from 13 to 16 years. The implications for children’s digital rights are not well understood and, at the moment, nobody knows whether this regulation will protect children or, on the contrary, make them more vulnerable. What is certain is that, until now, minimal consultation to incorporate children’s voices has taken place, and consequently children’s digital rights are not being treated with the respect or seriousness they deserve.
Young people are a highly vulnerable group on social media. Current research (summarised at https://www.nspcc.org.uk/services-and-resources/research-and-resources/2015/how-safe-are-our-children-2015/) suggests that 1 in 3 have been victims of cyberbullying and 1 in 4 have experienced something upsetting on a social media site. The ‘Digital Wildfire’ project (www.digitalwildfire.org) explores the spread of provocative and antagonistic content on social media and seeks to identify opportunities for the responsible governance of digital social spaces. As part of this we have spent time engaging with young people and school teachers to find out their views on social media and the harms it can cause.
In celebration of Data Protection Day (also known as Data Privacy Day), please join us for the launch of our #AnalyzeMyData campaign on Twitter. Through this campaign we hope to increase public awareness of the ways in which data is used and misused, and to establish an evidence base of public opinion on these issues that can be used to support future policy discussions around improved guidelines and regulations for data access consent.