Hidden costs of personalized information services?

Personalized information filtering by online search engines, social media, news sites and retailers represents a natural evolution towards ever more finely tuned interaction with users. Since the internet provides an overwhelming quantity of information on most topics, information overload has become one of the main concerns for users. The perceived quality of an information service is therefore strongly determined by the ease with which the user can obtain information that satisfies their current needs. For many of the most successful internet services, such as Google, Amazon.com, YouTube, Netflix and TripAdvisor, the recommender system is a key element in their advantage over rival services in the same sector. Some, like Netflix, openly acknowledge this, even to the extent of awarding large prizes to anyone who can improve their recommender system.
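To make the idea concrete, here is a minimal sketch of one common recommender technique, user-based collaborative filtering: unseen items are ranked by the ratings of similar users. All the names and ratings are invented for illustration; production systems are vastly more sophisticated.

```python
from math import sqrt

# Toy user-item ratings (illustrative data only): user -> {item: rating}.
ratings = {
    "alice": {"film_a": 5, "film_b": 3, "film_c": 4},
    "bob":   {"film_a": 4, "film_b": 2, "film_d": 5},
    "carol": {"film_b": 5, "film_c": 4, "film_d": 3},
}

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    return dot / (sqrt(sum(u[i] ** 2 for i in shared)) *
                  sqrt(sum(v[i] ** 2 for i in shared)))

def recommend(user, k=2):
    """Rank items the user has not yet seen by similarity-weighted ratings."""
    scores = {}
    for other, their_ratings in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], their_ratings)
        for item, r in their_ratings.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # -> ['film_d']: the only item alice has not rated
```

The key point for the discussion that follows is that everything here is driven by recorded past behaviour: the quality of the recommendation depends entirely on how much the service knows about you.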

There are, however, numerous social and ethical concerns introduced by the use of these filters. One social concern is the fear that optimizing people's information flows to focus on things they have previously shown an interest in may create a feedback loop in which people become isolated from new information inside a self-reinforcing 'filter bubble'. To what extent this can, or does, happen as a consequence of personalized filtering is not yet clear. A theoretical analysis by Van Alstyne & Brynjolfsson showed that, under certain conditions, such a scenario is possible; however, little experimental work has been done to verify whether the 'filter bubble' scenario is actually taking place. Most research on the impact of personalization and recommender systems has so far focused on their commercial success in increasing sales and web impressions, and on their ability to increase consumer interest in niche goods.
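The feedback loop at the heart of the filter-bubble concern can be sketched in a few lines. This is a deliberately crude, hypothetical two-topic model, not a claim about any real service: the filter always shows the user's currently dominant topic, and each click nudges the profile further towards what was shown.

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

# Illustrative starting profile: a mild 55/45 preference between two topics.
interest = {"politics": 0.55, "sport": 0.45}

def recommend_topic():
    # The filter shows whichever topic currently dominates the profile.
    return max(interest, key=interest.get)

for step in range(50):
    shown = recommend_topic()
    # The user clicks with probability equal to their current interest;
    # each click shifts the profile further towards the shown topic --
    # the self-reinforcing loop.
    if random.random() < interest[shown]:
        interest[shown] += 0.05 * (1 - interest[shown])
        other = "sport" if shown == "politics" else "politics"
        interest[other] = 1 - interest[shown]

print(interest)  # the mild initial preference has drifted towards one topic
```

Even from a near-even start, the dominant topic is shown more, clicked more, and therefore reinforced, while the other topic is never surfaced at all. Real systems have many countervailing forces this sketch omits, which is precisely why, as noted above, the empirical question remains open.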

The user profiling required to provide personalized information services, however, also raises numerous ethical issues around privacy and data protection. Profiling of user behaviour is usually based largely on data about users' past search and browsing behaviour from their previous interactions with the service. To further refine these profiles, services may also gather information about user behaviour on other websites through 'tracking cookies', or by purchasing third-party access to such data from other services. Additionally, recommender systems may draw on data about the behaviour of people within a user's social network.
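In its simplest form, such a profile is just an aggregation of logged events into an interest distribution. The sketch below assumes a hypothetical event log of (user, site category) pairs, such as might be assembled from first-party logs combined with tracking-cookie data; the names and categories are invented.

```python
from collections import Counter

# Hypothetical behavioural event log: (user_id, site_category) pairs.
events = [
    ("u1", "travel"), ("u1", "travel"), ("u1", "news"),
    ("u1", "travel"), ("u1", "shopping"),
]

def build_profile(user_id, log):
    """Normalise one user's event counts into an interest distribution."""
    counts = Counter(category for uid, category in log if uid == user_id)
    total = sum(counts.values())
    return {category: n / total for category, n in counts.items()}

profile = build_profile("u1", events)
print(profile)  # {'travel': 0.6, 'news': 0.2, 'shopping': 0.2}
```

The point is how little machinery is needed: once the events are collected, from whatever source, turning them into a predictive profile is trivial, which is why the consent questions discussed next centre on the collection itself.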

The use of 'tracking cookies' is clearly the most troubling practice; however, even the logging of users' behaviour when they are actively engaging with the information service itself lacks proper informed consent. Any information about this data logging in the service's terms and conditions is unlikely to be read, let alone understood, by users.

Beyond the data collection process, the user profile that determines the personalized information filtering is built to anticipate the user's behaviour, interests and desires. Such personal profiles could therefore be used to plan targeted phishing campaigns or other social engineering attacks.

Further concerns arise from the lack of transparency and the potential for increasingly covert manipulation of user behaviour in favour of the commercial interests of the predominantly advertising-based business models of information services.

Because of the commercial advantage that service providers hope to gain from personalized information filtering, the algorithms are typically not made public. This lack of transparency makes it impossible for users to fully understand how their data is being gathered and used, thus preventing any true informed consent. A key concern is that most of these services earn their money not from users but from corporate customers who pay for targeted advertising. This directly raises the question of how much, and for whose agenda, users are being manipulated by personalized filtering. Advertising, of course, is all about attempting to persuade, i.e. manipulate, potential consumers into purchasing the product or service being advertised. In case there was any doubt, the willingness of information service providers to manipulate their information filtering for purposes other than serving the user was clearly demonstrated by the "Facebook news-feed manipulation experiment".

As part of our work towards developing ethical citizen centered internet tools, CaSMa will present a paper on this topic at the 2nd International Conference on Internet Science “Societies, Governance and Innovation”. The full paper and slides from the talk will be made available after the conference.

In the meantime, please feel free to participate in our survey on the conditions you would want fulfilled before considering giving access to your social media data for research purposes.
