Learning from the mistakes of others is perhaps the most valuable lesson that Samaritans Radar offers research communities concerned about privacy and the ethical treatment of social media data, from collection through to analysis.
The Samaritans Radar app was undoubtedly designed with good intentions, with the aim of saving lives. However, it was eventually suspended following numerous complaints and reports that it was contributing to genuine distress. The app was designed to identify tweets containing one of 75 key words associated with real suicide attempts, and then alert the Twitter community linked to the specific user. These 75 distress keywords were identified by Professor Jonathan Scourfield and, with the permission of the Department of Health, shared with the Samaritans. Interestingly, this Cardiff University project, titled ‘Understanding the role of Social Media in the Aftermath of Youth Suicides’, was funded by the Department’s Policy Research Programme (PRP). This is interesting because the study used the Cardiff Online Social Media Observatory (COSMOS) to collect and explore social media data (especially via Twitter, Tumblr and Facebook). COSMOS provides ethical guidelines for social media research and states that social research ethics are ‘at the core of the COSMOS programme’. Yet even research groups dedicated to ensuring high ethical standards in social media research can inadvertently participate in projects that have the potential to generate distress and breach the trust, respect and privacy of vulnerable individuals with mental health problems.
All parties involved in the development and implementation of this well-intentioned app underestimated people’s sensitivity and needs regarding privacy. Some experts accused the charity of lacking sufficient familiarity with data issues and social media communities to embark on such an ambitious project. Additionally, in his article for The Conversation, Professor Jonathan Scourfield raised the issue of ‘creeping surveillance’ and the need for a debate about the ‘balance of protection and civil liberties’. Contrary to public opinion, however, rather than being the ones to blame, we might actually view the Samaritans as the unwitting victims of research negligence. Ultimately, it is the Samaritans’ reputation and the trust placed in them that have been damaged, while minimal debate has focused on the ethical guidelines and research integrity that drove the development and implementation of Samaritans Radar.
There are two key ways in which researchers studying social media, such as CaSMa, can learn from the mistakes made in this case. Firstly, by ensuring that any research activity is respectful of people’s need for autonomy and privacy. Secondly, by promoting real solutions for anonymisation and serving the genuine need for citizens to be able to protect their digital rights to privacy.