As part of the continuing theme on Data Driven Innovation, Nesta published an article on their blog with the title “Striking a balance: Data protection vs. Data Driven Innovation”. In it they call for a debate on establishing the right balance between data protection and data driven innovation, to ensure that the UK economy does not suffer but also that personal data is not misused.
As a researcher working on issues of ethics and citizen rights relating to social media and internet-mediated research, I welcome this call by Nesta for a debate on establishing a balance between the concerns of data providers (citizens) and data users. While the issue is often referred to as a debate on privacy, Nesta rightly points out that privacy concerns are only one part of the greater debate to be had around digital and data rights.
Before looking at the finer points of this debate, however, I would like to draw attention to the fact that it is not taking place in a vacuum. Rather, it is part of a much larger debate about the social fabric of society and the relationship between the rights of ordinary people, i.e. ‘citizens’, and the behaviours of large organizations, i.e. corporations and governments. Within this larger picture, the terminology of ‘consumers’ for referring to the citizens whose data rights are being discussed risks imposing a corporate worldview that will alienate people who refuse to be treated as nothing more than pawns in corporate chess games.
Returning to the main issue of the debate, the way in which the discourse is formulated, as a clash of Data protection vs. Data Driven Innovation, is unnecessarily confrontational. Instead, let us start by acknowledging that all parties agree there is great value in data and that proper use of data can lead to great improvements. But let us also acknowledge that it is unreasonable to expect anyone to be comfortable handing over ever-increasing amounts of personal information in exchange for a vague promise of some future improvement, when the receiver effectively admits that they have no idea what they will do with the data.
Data Driven Innovation has to move away from the current model, in which massive collection of personal data is treated as a starting point, to be followed later by attempts to find some way of extracting something from the data that can be made to turn a profit. This business model naturally leads people to fear that even a company that starts with the best intentions may, when money starts to run out and it still has not found an ethical way to profit from the data, turn to unethical means to save itself from going bankrupt.
Framed in corporate terminology, people who contribute their data have the same concerns as investors who put their money into a company. Investors, rightly, ask to see a business plan with a clearly defined strategy outlining how the company intends to generate a return on their investment. People whose private data is mined have the same concern: they want to know that their data is well used.
With regard to existing data protection regulations, while it is true that most citizens are unaware of the regulations, or of the existence of the Information Commissioner’s Office, this lack of awareness is at least in part because the regulations enforced by the ICO have so little apparent impact on corporate behaviour regarding data usage. One important reason, of course, is that many of the companies handling people’s data are based outside the jurisdiction of the ICO. Another, however, is that the regulations (or at least their enforcement) often appear to allow ‘intended use of data’ to be defined so broadly that companies can still collect data with little prior idea of what they will do with it.
In the absence of alternative business models around the use of people’s data, the default route to monetization still appears to be targeted advertising. Much of the debate around DDI and advertising is framed in terms of people’s fear of having private data traced back to them as individuals, through insufficient anonymization, leading to direct violations of their privacy. These fears are real, and this is an issue that requires debate, but it is not the only concern. Even if data is kept completely anonymous, people can still feel uncomfortable with detailed profiles of their personality ‘type’ being produced and used to influence their behaviour. There is a real concern that sophisticated personality-type profiling will lead to increasingly successful behavioural manipulation, robbing people of their free will. Not so much “Big Brother”, more “Brave New World”. Even in the 80s, when commercial Saturday-morning children’s TV shows became hugely popular, people expressed concern about the advertising targeted at children. No one was concerned about the privacy of their children, but they were concerned about the manipulation of their children’s interests (purchasing desires).
At CaSMa we are working to develop citizen-centric approaches to social media analysis and to promote ethical research practices for internet-mediated research. In our view, one of the challenges for academia should be to provide examples of analysis methods that establish the viability of data driven innovation conducted to the highest ethical standards, with respect for the dignity and privacy of individuals.