Tag Archives: anonymity

Watching Privacy at Apple Watch

Based primarily on the past success of the Apple marketing machine, there is a great expectation that the Apple Watch will give a dramatic push to the sale of wearable technology. For CaSMa, wearable tech of the Apple Watch variety raises a number of potential concerns relating to the way in which the data is managed and the limited control that users often have over this information stream.

Continue reading Watching Privacy at Apple Watch

Ansgar & Chris, in The Conversation this week!

What use would a digital bill of rights be?

Ansgar Koene, University of Nottingham and Chris James Carter, University of Nottingham

The Magna Carta, no relation to Chris.

The Liberal Democrats have been a lone voice among the parties calling for a digital bill of rights governing our growing use of the internet. But is it the right solution for the problem in hand?

Surveys suggest that the bill should pique the interest of at least a few floating voters, with almost three-quarters of British adults in one survey concerned over unauthorised access to their private information online.

Continue reading Ansgar & Chris, in The Conversation this week!

A wave of calls for citizen rights on the Internet

It’s been a busy week in the world of digital rights. On April 11th the UK’s Liberal Democrat party decided to put digital rights on the election campaign agenda by launching a proposal for a Digital Bill of Rights. On April 15th, the Global Commission on Internet Governance released a statement titled “Towards a Social Compact for Digital Privacy and Security” in the run-up to the 2015 Global Conference on Cyber Space in The Hague, which culminated with the launch of the Global Forum on Cyber Expertise.

Continue reading A wave of calls for citizen rights on the Internet

BAAL workshop on Ethics of Online Research Methods

BAAL Language and New Media SIG 2015 Workshop

Workshop aims

Today, more than ever, data are widely accessible, visible, and searchable for research in digital media contexts. At the same time, new data types and collection methods challenge existing approaches to research ethics and raise significant and difficult questions for researchers who design, undertake and disseminate research in and about digital environments.
The aims of this workshop are to bring together researchers who use online research methods and data in different subfields of applied linguistics, to discuss ethical considerations in online data collection and analysis, and to identify challenges and share solutions to ethical issues arising from applied linguistics research.

Keynote speakers

Alexandra Georgakopoulou (King’s College London)
Claire Hardaker (Lancaster University)
Annette Markham (Aarhus University, Denmark)
Stephen Pihlaja (Newman University, Birmingham)

Contact Details

Tereza Spilioti, email: SpiliotiT1@cardiff.ac.uk
Helen Clifford, email: encap-events2015@cardiff.ac.uk


CaSMa provided two contributions to this workshop:

Ansgar Koene presented: Participant Consent and Withdrawal when using publicly archived data

Abstract: In this paper we start by critically analysing the publicness of various types of online archived data and discussing under which conditions such archives relieve researchers from their ethical requirement to seek consent from, and provide opportunities for withdrawal to, participants. We will argue that, from the perspective of the participants who contribute data to online platforms, most online data cannot be classified into binary categories of public or private, but lies within a spectrum between these extremes. If we consider, for instance, the way in which many people use Twitter, we observe that Twitter conversations often take the form of discourses within friend networks rather than public announcements. By analogy we might consider such conversations akin to a discussion between acquaintances in a public space, like a café. While the participants are willing to accept that their conversation can be heard by other people in the same space, they would not be comfortable with the idea that someone is systematically eavesdropping on and analysing their discussion. We should note in this context that the fact that Twitter automatically archives such conversations is not consciously considered by most Twitter users. In the second half of our paper we take a closer look at the case of Twitter, for which we will discuss a range of participant consent and withdrawal procedures. We will present the outcome of a feasibility pilot in which we surveyed the willingness of Twitter users to provide informed consent for having past tweets analysed when the specific research question is clearly explained to them. Based on this survey we will also discuss if, and how, the filtering of participants based on willingness to consent might skew the resulting data. We conclude with a set of recommendations for participant consent and withdrawal procedures to be used when accessing online archived data.


Elvira Perez Vallejos presented: Ethical considerations for online mental health communication research

Abstract: Young people experience severe and potentially long-lasting psychological difficulties, yet many perceive difficulties in communicating their concerns to professionals and only a fraction receive available support services. We propose to investigate the linguistic strategies with which adolescents present mental health concerns in online settings, the barriers they identify in communicating their experiences, and how these might be alleviated. The study will draw on an interdisciplinary team, applying expertise in applied linguistics and human-computer interaction to elucidate adolescents’ expressions and experiences of psychological distress. Relevant for this workshop will be the ethical considerations regarding informed consent, trust, anonymity, privacy and the right to withdraw. We will discuss the potential challenges regarding data access and analysis from a user-centric perspective, and potential solutions such as explicit opt-out/opt-in recruitment strategies. The results will inform the subsequent planning and design of online research involving vulnerable young people.

iRights Youth Juries


On April 9th the first two iRights Youth Juries were held at the University of Nottingham. In collaboration with the civil society initiative iRights and Prof. Coleman’s lab at the University of Leeds, CaSMa will be running 12 Youth Juries to allow children and young people to have a say about their rights on the internet. At the Youth Juries, groups of 10 to 15 participants, aged 12-17, are asked to consider, debate and share ideas about the future of the internet.

Continue reading iRights Youth Juries

CaSMa at the ICISSP 2015

From February 9 to 11 Ansgar participated on behalf of CaSMa in ICISSP 2015 (the 1st International Conference on Information Systems Security and Privacy) in Angers, France. The conference featured talks covering both technical and social issues, addressed from both practical and theoretical perspectives. Topics included Data and Software Security, Trust, Privacy and Confidentiality, Mobile Systems Security, and Biometric Authentication.

Continue reading CaSMa at the ICISSP 2015

Insights from the workshop on social media analysis and mental health: Putting people at the centre of human data

LOGO - Institute of Mental Health Nottingham

In collaboration with the Institute of Mental Health and CAS (Centre for Advanced Studies), the CaSMa team held a thought-provoking workshop that invited us all to reflect on and rethink concerns over social media data, especially when vulnerable adults and minors may inadvertently be part of it.

National experts including Monica Whitty, Karen Douglas, Jens Binder and Ilka Gleibs engaged with their audience to illustrate a series of relevant ethical aspects and their implications, not only for internet-mediated research but also for our day-to-day activities.

Continue reading Insights from the workshop on social media analysis and mental health: Putting people at the centre of human data