Having previously been postponed due to pre-election period Purdah restrictions, the “Rebooting the Expert” (a.k.a. “Routes to Policy Impact”) event finally took place on July 6th 2017.
On Friday 21st October a botnet of hacked Internet of Things devices launched a massive DDoS attack on a DNS service provider, causing major disruption to services like PayPal, Twitter and Netflix. To many security experts familiar with IoT, it was only a matter of time before this would happen. Assuming that this will act as a wake-up call, what can be done to improve IoT cybersecurity?
Our colleagues on the Digital Wildfires project teamed up with Oxford Sparks and Jason R.C. Nurse to produce a new video animation for young people about the joys and challenges of social media. In the words of the Digital Wildfires team: “Keeping Social Media depicts the ways that social media have revolutionised how we communicate. While these platforms open up an unimagined volume of ideas and possibilities, they also offer anonymity, which increases the chance that both children and adults may take risks and experiment with behaviours they would not consider offline. Our video describes how research can help find ways to tackle some of the challenges posed by social media and invites the viewer to consider how these digital social spaces should be regulated.”
Building on the results from our work on the iRights Youth Juries, CaSMa responded to the call for evidence from the House of Lords Communications Committee’s “Children and the Internet” inquiry. Following our submission at the end of August, Professor Derek McAuley was invited to give oral evidence, which took place on October 11th [transcript] [video].
On June 14th CaSMa and Gada organized a joint workshop on “youth civic engagement in the digital age”, funded by a seed grant from the Governance and Public Policy RPA. The purpose of this workshop was to explore definitions and understandings of what youth civic engagement is (and what it is not), what motivates young people to engage, and how to reach out to those whose voices are not being heard.
On Tuesday August 30th (2016), it was reported that the German government had asked Facebook to remove hateful and illegal posts more quickly, as part of its corporate social responsibility. Social media companies, however, are typically reluctant to be proactive in their approach to such removal, preferring to rely on notifications from users, because they do not want to be seen to edit the content that is shared: this might lead to them being labelled a publisher. The moment a social media company becomes a publisher, it would become subject to media regulations and open to libel laws. This was also the position that Zuckerberg reaffirmed one day earlier during a Q&A in Italy, where he said: “No, we’re a tech company, we’re not a media company”; Facebook builds “the tools, we do not produce any of the content.”
Weaponisation of artificial intelligence (AI) presents one of the greatest ethical and technological challenges of the 21st century and has been described as the third revolution in warfare, after the inventions of gunpowder and nuclear weapons. Despite the vital importance of this development for modern society, for legal and ethical practice, and as a technological turning point, there has been little systematic study of public opinion on this critical issue. This interdisciplinary project addresses that gap. Our objective is to analyse what factors determine public attitudes towards the use of fully autonomous weapons. To do this, we put the public at the centre of the policy debate, starting with youth engagement in political and decision-making processes.