On March 21st the House of Lords Communications Committee inquiry on Children and the Internet published its report, which incorporated a number of findings that came out of our Youth Juries engagement with 13–17 year old ‘digital natives’.
The UnBias team is pleased to announce the launch of a ground-breaking report that articulates the voices of children and young people on their relationship with the internet and digital technologies.
The launch will take place at the House of Lords on 31st January and will be presented by Baroness Beeban Kidron, Prof Stephen Coleman from Leeds University and Elvira Perez from the UnBias team. Children and young people will be attending the launch and contributing to the Q&A session.
This report is titled ‘The Internet on Our Own Terms: How Children and Young People Deliberated about their Digital Rights’ and describes the work carried out since April 2015, in which young people aged between 12 and 17 gathered in the cities of Leeds, London and Nottingham to participate in a series of jury-styled focus groups designed to ‘put the internet on trial’. In total, nine juries took place, involving 108 young people, approximately 12 participants per jury.
Building on the results from our work on the iRights Youth Juries, CaSMa responded to the call for evidence from the House of Lords Communications Committee “Children and the Internet” inquiry. Following our submission at the end of August, Professor Derek McAuley was invited to give oral evidence, which took place on October 11th [transcript] [video].
On June 14th CaSMa and Gada organised a joint workshop to explore “youth civic engagement in the digital age”, funded by a seed grant from the Governance and Public Policy RPA. The purpose of this workshop was to explore definitions and understandings of what youth civic engagement is (and what it is not), what motivates young people to engage, and how to reach out to those whose voices are not being heard.
Weaponisation of artificial intelligence (AI) presents one of the greatest ethical and technological challenges of the 21st century and has been described as the third revolution in warfare, after the invention of gunpowder and nuclear weapons. Despite the vital importance of this development for modern society, for legal and ethical practice, and as a technological turning point, there has been little systematic study of public opinion on this critical issue. This interdisciplinary project addresses that gap. Our objective is to analyse what factors determine public attitudes towards the use of fully autonomous weapons. To do this, we put the public at the centre of the policy debate, starting with youth engagement in political and decision-making processes.
Apple is not willing to weaken its hard-to-crack encryption protocols after being asked by a magistrate judge to build a weaker new version of its mobile operating system. The request comes from the FBI as an attempt to access the content of the iPhone used by one of the gunmen involved in the San Bernardino shooting. It has been argued that law enforcement officials wanted to use this high-profile act of terrorism to create political pressure on Apple to comply with the FBI’s demands and set a dangerous precedent for future cases.
According to the General Data Protection Regulation (GDPR), information society services that wish to process any personal information relating to a child under the age of 16 will require parental/guardian consent. The GDPR is the European Commission’s tool to unify data protection across the EU, and there are plans for it to be adopted in 2018. In the most recent GDPR draft released by the European Council, the age limit below which parental consent is mandatory has been raised from 13 to 16 years. The implications for children’s digital rights are not well understood and, at the moment, nobody knows whether this regulation will protect children or, on the contrary, make them more vulnerable. What is certain is that, until now, minimal consultation to incorporate children’s voices has taken place, and consequently children’s digital rights are not being treated with the respect or seriousness they deserve.
CaSMa hosted, in collaboration with the Wildfires team, a workshop on how rumours, provocative content and (mis)information flow and go viral on social media.
This is a serious problem that can bring freedom of information into conflict with societal safety. Self-regulation is one solution; the real challenge, however, is how to promote self-regulation and awareness. Continue reading Digital Wildfires→