In the shadow of the ongoing debate over the Investigatory Powers Bill, a debate whose rhetoric has been framed predominantly in terms of anti-terrorism and national security, the National Crime Agency is currently busy with its own internal ‘future scoping’ exercise examining the UK law enforcement community’s efforts regarding the interception of communications and associated data. At the heart of this exercise is the question of where the boundaries of acceptable communications interception lie, the boundaries that delimit ‘policing by consent’ in the fight against serious and organized crime in a democratic society.
The United Nations mandated University for Peace (UPEACE) recently launched a call for papers to contribute to an e-book “that examines how the uses of current technologies, and the development of new ones, can contribute to guarantee and protect human rights within the 2030 Agenda for Sustainable Development framework. Hence, considering that every Sustainable Development Goal aims to protect one or more human rights, and that states will rely on the use of technological innovations and global interconnectedness to implement the 2030 Agenda, we are looking for articles that explore one or more of the following topics:”
At the heart of current online consumer protection is the concept of informed consent, whereby the prospective consumer makes a conscious decision to sign up to a service with full knowledge of, and consent to, the consequences of doing so. Even under the newly signed EU General Data Protection Regulation, which will go into effect in 2018, this will not fundamentally change. For anyone who has ever used a commercial internet service, however, and this includes policy makers, it is glaringly obvious that there is a fundamental flaw in this approach, namely the assumption that the consumer has a good understanding of the contract being entered into.
Cyberbullying is becoming an alarming problem among children and young people, not only because of its links to mental health and well-being but because it is so widespread that it is coming to be considered a worse problem among teenagers than drug abuse.
Not all the news is disheartening, though…
The European Commission Directorate General for Communications Networks, Content & Technology, a.k.a. DG Connect, recently launched a survey (deadline April 10th 2016) on the ‘future of the internet’ as part of its Net Futures agenda, which was established to “pioneer and coordinate research, innovation, and policy initiatives on what lies beyond the current internet architecture, software and services.” Below is a copy of my submission to the survey.
While exploring the Internet And Human Rights Resources Center at the Internet Society, I encountered a highly informative report from the Global Commission on Internet Governance (CIGI), published in January 2016, on the extent to which the management of individuals’ fundamental rights, e.g. privacy and free speech, is in the hands of corporations.
The report presents an excellent overview of the various ways in which a small set of companies has come to dominate the web platforms where most people spend the majority of their time online, and how this dominance has made those companies major actors in determining the state of human rights online.
On 25th February 2016, the Digital4EU Stakeholder Forum, organized by the European Commission, took place in Brussels. This one-day conference was centred on the progress made in creating a Digital Single Market in Europe. The day’s agenda started with a pre-conference breakfast session on the European Fund for Strategic Investments (EFSI)’s financing opportunities for digital projects, especially for extending the roll-out of broadband internet connections in rural areas that have not yet achieved full internet penetration.
According to the General Data Protection Regulation (GDPR), information society services that wish to process any personal information relating to a child under the age of 16 years will require parental/guardian consent. The GDPR is the European Commission’s tool to unify data protection across the EU, and there are plans for it to be adopted in 2018. In the most recent GDPR draft released by the European Council, the age limit below which parental consent is mandatory has been raised from 13 to 16 years. The implications for children’s digital rights are not well understood and, at the moment, nobody knows whether this regulation will protect children or, on the contrary, make them more vulnerable. What is certain is that, until now, minimal consultation to incorporate children’s voices has taken place and, consequently, children’s digital rights are not being treated with the respect or seriousness they deserve.
On February 17th and 18th the Alan Turing Institute held a two-day ‘scientific scoping workshop’ on the Algorithm Society with the tag-line: “If data is the oil of the 21st century then algorithms are the engines that animate modern economies and societies by providing reflection, analysis and action on our activities. This workshop will look at how algorithms embed in and transform economies and societies and how social and economic forces shape the creation of algorithms.”
The workshop started with three talks covering FinTech (by Prof. Donald MacKenzie), human attitudes, expectations, and willingness to use and trust algorithmic decisions (by Berkeley Dietvorst), and a proposal for a “Machine Intelligence Commission” to investigate and interrogate algorithmic bias and compliance with regulations (by Geoff Mulgan).
In celebration of Data Protection Day (also known as Data Privacy Day), please join us for the launch of our #AnalyzeMyData campaign on Twitter. Through this campaign we hope to increase public awareness of the ways in which data is used/misused and establish an evidence base of public opinion on these issues that can be used to support future policy discussions around improved guidelines and regulations for data access consent.