On Friday 21st October a botnet of hacked Internet of Things devices launched a massive DDoS attack on a DNS service provider, causing major disruption to services like PayPal, Twitter and Netflix. To many security experts familiar with IoT, it was only a matter of time before this happened. Assuming that this attack will act as a wake-up call, what can be done to improve IoT cybersecurity?
Our colleagues on the Digital Wildfires project teamed up with Oxford Sparks and Jason R.C. Nurse to produce a new video animation for young people about the joys and challenges of social media. In the words of the Digital Wildfires team: “Keeping Social Media depicts the ways that social media have revolutionised how we communicate. While these platforms open up an unimagined volume of ideas and possibilities, they also offer anonymity, which increases the chance that both children and adults may take risks and experiment with behaviours they would not consider offline. Our video describes how research can help find ways to tackle some of the challenges posed by social media and invites the viewer to consider how these digital social spaces should be regulated.”
Building on the results of our work on the iRights Youth Juries, CaSMa responded to the call for evidence from the House of Lords Communications Committee’s “Children and the Internet” inquiry. Following our submission at the end of August, Professor Derek McAuley was invited to give oral evidence, which took place on October 11th [transcript] [video].
On Friday 23 September I attended a workshop on “RRI in the UK: the post Brexit future?”, organized by Prof. Bernd Stahl (De Montfort University), to discuss with UK researchers engaged with the Responsible Research and Innovation agenda the current state of RRI in the UK and where the research field might head next. One of the stated aims of the workshop was to “look to develop a strategy/roadmap, which enables all UK academics working in this field to feel that there is a way forward” [if/when EU funding for RRI is no longer available post-Brexit].
On June 14th CaSMa and Gada organized a joint workshop on “youth civic engagement in the digital age”, funded by a seed grant from the Governance and Public Policy RPA. The purpose of the workshop was to explore definitions and understandings of what youth civic engagement is (and what it is not), what motivates young people to engage, and how to reach out to those whose voices are not being heard.
On Tuesday August 30th (2016), it was reported that the German government had asked Facebook to remove hateful and illegal posts more quickly, as part of its corporate social responsibility. Social media companies, however, are typically reluctant to be proactive in their approach to such removal, preferring to rely on notifications from users, because they do not want to be seen to edit the content that is shared, since this might lead to them being labelled a publisher. The moment a social media company becomes a publisher, it becomes subject to media regulations and open to libel laws. This was also the position that Zuckerberg reaffirmed one day earlier during a Q&A in Italy, where he said: “No, we’re a tech company, we’re not a media company”; Facebook builds “the tools, we do not produce any of the content.”
“Made in the EU with GDPR inside”, will this be the new label to look for when seeking a quality online service with reliable privacy guarantees?
The POET (Public Outreach Engagement Tool) project is currently running a second round of interviews to map out how academics at the University of Nottingham are using social media in the context of public engagement, especially with regard to the Impact and Responsible Research and Innovation (RRI) agendas. We previously conducted one round of interviews with science researchers, and have held a workshop to think about what the output of this tool could look like. Our project is still at the stage of collecting information about how social media is used by academics as part of their working day: to what extent it is used, the feelings associated with using it, whether their motivations for using it are work related, and whether this tool would be useful for their current public engagement work.
The weaponisation of artificial intelligence (AI) presents one of the greatest ethical and technological challenges of the 21st century and has been described as the third revolution in warfare, after the invention of gunpowder and nuclear weapons. Despite the significance of this development for modern society, for legal and ethical practice, and as a technological turning point, there has been little systematic study of public opinion on this critical issue. This interdisciplinary project addresses that gap. Our objective is to analyse what factors determine public attitudes towards the use of fully autonomous weapons. To do this, we put the public at the centre of the policy debate, starting with youth engagement in political and decision-making processes.
Have you ever actually read the terms and conditions before signing up to a website or ordering something online? These long, wordy documents are a form of consumer protection designed to make sure we are fully informed when we agree to an online contract. They are supposed to ensure we are making a conscious decision to sign up to a service with full knowledge of the consequences.