Psychological profiling for a shopping bargain

iBeacons

Comment on “Is sending shoppers ads by Bluetooth just a bit creepy?” in The Conversation.

Professor Angela Sasse and Dr Charlene Jennett, based at the UCL Interaction Centre (UCLIC), are interested in understanding how people interact with technology and, in particular, in the use of proximity ‘beacons’. iBeacon is one such indoor proximity system that can trigger actions on smartphones and other devices. The technology has already been trialled in the retail sector to simplify payments and deliver on-site offers and personalised adverts to customers. While it seemingly offers consumers a quicker and more streamlined shopping experience, its application also raises a number of ethical issues that require consideration.
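
To ground the discussion, the sketch below (in Python, purely illustrative) shows what an iBeacon actually broadcasts: following Apple’s widely documented advertisement format, the beacon transmits nothing but a proximity UUID plus ‘major’ and ‘minor’ numbers, and it is the retailer’s app on the shopper’s phone that maps those identifiers to a location and decides which offer or advert to trigger. The example payload values are invented.

```python
import struct
import uuid

def parse_ibeacon(mfg_data: bytes):
    """Parse the manufacturer-specific data of an iBeacon BLE advertisement.

    Layout (as widely documented): Apple company ID 0x004C (little-endian),
    type 0x02, length 0x15, a 16-byte proximity UUID, 2-byte major and minor
    (big-endian), and a signed 1-byte calibrated TX power.
    Returns None if the payload is not an iBeacon frame.
    """
    if len(mfg_data) != 25 or mfg_data[0:4] != b"\x4c\x00\x02\x15":
        return None
    proximity_uuid = uuid.UUID(bytes=mfg_data[4:20])
    major, minor = struct.unpack(">HH", mfg_data[20:24])
    tx_power = struct.unpack("b", mfg_data[24:25])[0]
    return {"uuid": proximity_uuid, "major": major, "minor": minor, "tx_power": tx_power}

# Hypothetical payload: a retailer's UUID, major = store number, minor = aisle.
example = bytes.fromhex(
    "4c000215" + "f7826da64fa24e988024bc5b71e0893e" + "0001" + "000a" + "c5"
)
beacon = parse_ibeacon(example)
if beacon:
    # The beacon itself carries no personal data; the phone (or the
    # retailer's app) maps these identifiers to a location and decides
    # what offer or advert to trigger.
    print(beacon)
```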


According to studies by the UCL Interaction Centre, personalised ads are more likely to be noticed by customers, but the probability of a tailored advert being accepted or rejected depends on the customer’s specific needs at that exact moment, which are not easy to predict. This ‘unpredictable’ decision-making behaviour poses one of the biggest challenges for smartphone advertising. Obtaining sensitive, fine-grained data on the customer’s emotional state and immediate circumstances could generate more accurate and profitable predictions, but at a potentially high price to the privacy and psychological integrity of the customer whose personal and intimate data is being mined.
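
A toy example may make the point about fine-grained data concrete. The sketch below is not UCLIC’s model or any retailer’s system; the feature names and weights are invented. It simply illustrates why momentary context matters: with only a coarse purchase-history profile, a simple logistic score sits near chance, whereas adding fine-grained situational signals (is the shopper in a hurry, are they standing by the shelf) shifts the prediction sharply, and it is precisely that kind of intimate data the paragraph above warns about.

```python
import math

def predicted_accept_probability(features, weights, bias=0.0):
    """Toy logistic score: probability a shopper accepts a tailored offer.

    Entirely illustrative -- the feature names and weights are invented,
    not a model used by UCLIC or any retailer.
    """
    score = bias + sum(weights.get(name, 0.0) * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-score))

weights = {"bought_category_before": 0.6, "in_a_hurry": -2.5, "near_product_shelf": 0.8}

# Coarse profile only: the prediction hovers near chance.
coarse = {"bought_category_before": 1.0}
# The same shopper seen with momentary context gets a very different,
# more confident prediction (here: in a hurry, so unlikely to accept).
fine_grained = {"bought_category_before": 1.0, "in_a_hurry": 1.0, "near_product_shelf": 1.0}

print(round(predicted_accept_probability(coarse, weights), 2))        # ~0.65
print(round(predicted_accept_probability(fine_grained, weights), 2))  # ~0.25
```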

Even if developers of technologies such as iBeacon are well intentioned, it is important to remember that their primary purpose is to serve the interests of businesses that are trying to sell products more effectively. In exchange for easier access to products and special offers, prospective users (that is, customers) are asked to grant deep access to the data of their private lives. That personal information is then processed by companies to build psychological profiles that can be matched to specific marketing strategies. If the service follows the standard ‘big data’ model, users transmit their data to a central database maintained by the service provider.
From this point onwards, the power balance in the relationship between the user and the service provider is heavily tilted towards the service provider. With typically very limited transparency, the user has to trust that the service provider: 1) has sufficient security to safeguard the user’s data against theft; 2) will analyse the data only to the extent necessary for providing the agreed-upon service; 3) will not sell the data to third parties; 4) will properly remove all the data they hold on the user if/when the user indicates a wish to withdraw from the service; and 5) will refrain from using the data to nudge the user into irresponsible spending behaviour.
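
As a minimal sketch of what honouring points 2 and 4 above could look like on the provider side, the hypothetical code below enforces purpose limitation and deletes everything when consent is withdrawn. The class and method names are invented for illustration and do not describe any real provider’s implementation.

```python
from dataclasses import dataclass, field

@dataclass
class UserRecord:
    user_id: str
    consented_purposes: set = field(default_factory=set)
    data: dict = field(default_factory=dict)

class ProfileStore:
    """Illustrative provider-side store honouring purpose limitation and withdrawal."""

    def __init__(self):
        self._records = {}

    def add_user(self, user_id, purposes):
        self._records[user_id] = UserRecord(user_id, set(purposes))

    def analyse(self, user_id, purpose):
        # Point 2: only process data for purposes the user agreed to.
        record = self._records[user_id]
        if purpose not in record.consented_purposes:
            raise PermissionError(f"No consent for purpose: {purpose}")
        return record.data

    def withdraw(self, user_id):
        # Point 4: withdrawal removes all stored data for the user.
        self._records.pop(user_id, None)

store = ProfileStore()
store.add_user("alice", purposes={"in-store offers"})
store.analyse("alice", "in-store offers")   # allowed under the agreed purpose
store.withdraw("alice")                     # all of alice's data is removed
```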

Experiments such as those performed by Prof. Sasse and Dr Jennett can help us to understand human behaviour and have the potential to greatly improve and optimise people’s daily activities. At the same time, however, the ethical issues associated with such studies serve as a reminder that researchers have a responsibility to consider the power imbalance between the user and the service provider that devices such as the iBeacon can introduce, and should contribute to the development of new operating principles to safeguard the rights of individuals.

With this in mind, the CaSMa project is looking at solutions that put people at the centre of human data by introducing a novel citizen-centred approach to social media analysis. CaSMa aims to promote ways for individuals to control their data and their desired level of privacy, including mechanisms that make it realistically possible to implement a withdrawal of consent. Indeed, one of the core CaSMa objectives is to ensure that social media users are aware of how their personal data can be used to understand human behaviour, and of the ethics of handling human data obtained from online sources.