On March 21st the House of Lords Communications Committee inquiry on Children and the Internet published its report, which incorporated a number of findings that came out of our Youth Juries engagement with 13–17 year old ‘digital natives’.
The UnBias team is pleased to announce the launch of a ground-breaking report that articulates the voice of children and young people, and their relationship to the internet and digital technologies.
The launch will take place at the House of Lords on the 31st of January and will be presented by Baroness Beeban Kidron, Prof Stephen Coleman from Leeds University and Elvira Perez from the UnBias team. Children and young people will be attending the launch and contributing to the Q&A session.
The report is titled ‘The Internet on Our Own Terms: How Children and Young People Deliberated about their Digital Rights’ and describes the work carried out since April 2015, in which young people aged between 12 and 17 gathered in the cities of Leeds, London and Nottingham to participate in a series of jury-styled focus groups designed to ‘put the internet on trial’. In total, nine juries took place involving 108 young people, approximately 12 participants per jury.
Building on the results from our work on the iRights Youth Juries, CaSMa responded to the call for evidence from the House of Lords Communications Committee “Children and the Internet” inquiry. Following our submission at the end of August, Professor Derek McAuley was invited to give oral evidence, which took place on October 11th [transcript] [video].
On June 14th CaSMa and Gada organized a joint workshop to explore “youth civic engagement in the digital age”, funded by a seed-grant from the Governance and Public Policy RPA. The purpose of this workshop was to explore definitions and understandings of what youth civic engagement is (and what it is not), what motivates young people to engage, and how to reach out to those whose voices are not being heard.
On Tuesday August 30th (2016), it was reported that the German government had asked Facebook to remove hateful and illegal posts more quickly, as part of its corporate social responsibility. Social media companies, however, are typically reluctant to be proactive in their approach to such removal, preferring to rely on notifications from users, because they do not want to be seen to edit the content that is shared: this might lead to them being labelled a publisher. The moment a social media company becomes a publisher, it becomes subject to media regulations and open to libel laws. This was also the position that Zuckerberg reaffirmed one day earlier during a Q&A in Italy, where he said: “No, we’re a tech company, we’re not a media company.” Facebook builds “the tools”; it does “not produce any of the content.”
Weaponisation of artificial intelligence (AI) presents one of the greatest ethical and technological challenges of the 21st century and has been described as the third revolution in warfare, after the invention of gunpowder and nuclear weapons. Despite the vital importance of this development for modern society, for legal and ethical practice, and as a technological turning point, there has been little systematic study of public opinion on this critical issue. This interdisciplinary project addresses that gap. Our objective is to analyse what factors determine public attitudes towards the use of fully autonomous weapons. To do this, we put the public at the centre of the policy debate, starting with youth engagement in political and decision-making processes.
Have you ever actually read the terms and conditions before signing up to a website or ordering something online? These long, wordy documents are a form of consumer protection designed to make sure we are fully informed when we agree to an online contract. They are supposed to ensure we are making a conscious decision to sign up to a service with full knowledge of the consequences.
On June 16th we joined civil society organizations such as Privacy International, the European Digital Rights association (EDRi) and various others for a half-day civil society summit organized by the European Data Protection Supervisor (EDPS). On the agenda were a brief overview of the “Big Issues in Privacy and Data Protection in 2016” by Joe McNamee of EDRi, followed by three one-hour sessions: “Implementation of the GDPR, consistency, flexibility, guidelines”, introduced by Anna Fielder (Privacy International); “Reform of e-Privacy Directive: What’s at stake?”, introduced by Prof. Ian Brown (Oxford Internet Institute); and “Necessity and proportionality and data protection”, introduced by Ralf Bendrath (German Working Group on Data Retention and Digitale Gesellschaft).
In the shadow of the ongoing debate over the Investigatory Powers Bill, a debate in which much of the rhetoric has been framed in terms of anti-terrorism and national security, the National Crime Agency is currently busy with its own internal ‘future scoping’ exercise to examine the UK law enforcement community’s efforts regarding interception of communications and associated data. At the heart of this exercise is the question of identifying the boundaries of acceptability of such communications interceptions, boundaries that delimit ‘policing by consent’ in the fight against serious and organized crime in a democratic society.