Based primarily on the past success of the Apple marketing machine, there is a great expectation that the Apple Watch will give a dramatic push to the sale of wearable technology. For CaSMa, wearable tech of the Apple Watch variety raises a number of potential concerns relating to the way in which its data is managed and the limited control that users often have over this information stream.
One of the recent trends for sensorized smart devices, be they smart phones, smart TVs or indeed smart watches, is to boost their capabilities by offloading processing requirements into the cloud. While this decision makes technological sense, the fact that it requires raw data to be transmitted out from the device, and away from the immediate control of the user, introduces privacy concerns. In some cases, e.g. the Samsung Smart TV, this has already led to controversy. It also puts extra pressure on the available bandwidth of mobile phone networks and eats into people’s mobile-data allowances.
The two Apple Watch features that have received the most attention regarding privacy are Apple Pay and Apple’s HealthKit. Voice activation and control through Siri have received much less attention, even though the small screen/touch-surface makes voice control and dictation much more attractive for the watch form factor than they were on the phone.
For Apple Pay, the contactless payment system that relies on near-field communication to wirelessly exchange information between devices, Apple CEO Tim Cook sought to put to rest any fears that Apple might monitor purchase behaviour, stating: “If you use your phone to buy something on Apple Pay, we don’t want to know what you bought, how much you paid for it and where you bought it. That is between you, your bank and the merchant … It is a cop-out to say: choose between privacy or security. There is no reason why customers should have to select one. There is no reason not to have both.” According to the product information provided by Apple, “Apple does not store any payment information on the devices or on Apple’s servers. It simply acts as a conduit between the merchant and bank”. Under this operating model, Apple does indeed appear to have avoided privacy concerns relating to Apple Pay.
When it comes to HealthKit, Apple again presents strong, proactive claims about its concern for the privacy of its users. Jeff Williams, Apple’s head of operations, noted that for the Apple Watch, “Apple is forbidding app developers from storing any health information on Apple’s iCloud service. All health information logged by the watch is encrypted on the device and users decide which apps have access to the data.” While this sounds very consumer-minded, it is not actually much different from the standard model for smartphone apps, where users tick a consent box during installation to allow the app to access information like GPS or contact lists. Also, even though app developers are forbidden from storing health data on the iCloud service, there are no restrictions on the use of other cloud storage. A more meaningful restriction on app developers is provided by the HealthKit guidelines, which state that “apps working with HealthKit, may not use the personal data gathered for advertising or data-mining uses other than for helping manage an individual’s health and fitness, or for medical research.” Cynics might however counter that it will probably only be a matter of time before “helping manage an individual’s health and fitness” is interpreted as a valid reason for using the data to drive advertisements for health foods or other health products and services.

No doubt inspired by the calls for better data security and improved data encryption following the revelations about NSA/GCHQ snooping, the security of the data transfers between watch, phone and cloud appears to have been well fortified, as described on Apple’s ‘Privacy-Built-In’ information pages: “Your data in the Health app and your Health data on Apple Watch are encrypted with keys protected by your passcode.
Your Health data only leaves or is received by your iPhone or Apple Watch when you choose to sync, back up your data, or grant access to a third-party app. Any Health data backed up to iCloud is encrypted both in transit and on our servers.”
While the health and payment functions of the Apple Watch have been drawing most of the attention, the biggest potential threat to privacy may in fact lie in the voice control options provided by the Siri function. As stated in the Siri privacy statement: “When you use Siri, which includes the dictation feature of your device, the things you say and dictate will be recorded and sent to Apple to process your requests”. This means that in order to recognize the voice activation phrase for Siri, “Hey Siri”, all data received by the microphone must be sent to and analysed by Apple even when Siri is officially not ‘active’. This is the same situation as with the voice activation of the Samsung Smart TV. The Siri documentation further states that: “To help them recognize your pronunciation and provide better responses, certain User Data such as your name, contacts, and songs in your music library is sent to Apple servers using encrypted protocols. Siri and Dictation do not associate this information with your Apple ID, but rather with your device through a random identifier. Apple Watch uses the Siri identifier from your iPhone.” According to Apple’s Muller, the company takes steps to ensure that the data is anonymized and only collects the Siri voice clips in order to improve Siri itself.
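The privacy significance of such a persistent “random identifier” can be illustrated with a small sketch. The device tokens and utterances below are entirely invented (they are not real Siri data or API fields); the point is simply that replacing the Apple ID with a fixed per-device token does not prevent all of a device’s requests from being grouped into a single longitudinal profile.

```python
from collections import defaultdict

# Invented example requests, each tagged with the kind of persistent
# random identifier described above (hypothetical data and field names).
requests = [
    {"device_token": "d41c2f", "utterance": "directions to the clinic"},
    {"device_token": "d41c2f", "utterance": "remind me to take my medication"},
    {"device_token": "9b7e10", "utterance": "play some jazz"},
]

# Group utterances by token: no Apple ID is needed to assemble a
# per-device history, and it is exactly this kind of accumulated
# profile that later linkage attacks can tie to a named individual.
profiles = defaultdict(list)
for req in requests:
    profiles[req["device_token"]].append(req["utterance"])

print(dict(profiles))
# {'d41c2f': ['directions to the clinic', 'remind me to take my medication'],
#  '9b7e10': ['play some jazz']}
```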
Unfortunately, what Apple refers to as anonymization would more accurately be described as de-identification: simply stripping the data of immediate personal identifiers such as the Apple ID. Due to the high level of individual differences in voice patterns, actual anonymization of voice data is not achievable as long as the voice recordings themselves are retained; it would be like trying to anonymize fingerprints. Additionally, as has become increasingly clear from social media data, any large enough dataset of personal online activities can invariably be linked to other publicly available data in a way that allows individuals to be identified. Examples include the de-anonymization of users in the AOL search logs by journalists of the New York Times in 2006 (Barbaro and Zeller, 2006), and the 2008 paper by Narayanan and Shmatikov showing that the Netflix Prize dataset, which contained only movie ratings by de-identified users, was rich enough to allow users to be identified through correlation with publicly available datasets.
These reassurances must also be read in conjunction with Apple’s general Privacy Policy, which states:

- “We may collect information such as occupation, language, zip code, area code, unique device identifier, referrer URL, location, and the time zone where an Apple product is used so that we can better understand customer behavior and improve our products, services, and advertising.
- We may collect and store details of how you use our services, including search queries. This information may be used to improve the relevancy of results provided by our services. Except in limited instances to ensure quality of our services over the Internet, such information will not be associated with your IP address.”
Clearly, in light of the previously cited examples of de-anonymization, it would be trivially simple to use this so-called ‘non-personal information’ to de-anonymize the data.
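To make the linkage point concrete, the following sketch shows how just a few of the quasi-identifiers listed above (zip code, area code and time zone) can suffice to match a “de-identified” record against a public directory. All records, names and field names here are invented for illustration; real attacks of this kind, such as the AOL and Netflix cases, work on the same principle at much larger scale.

```python
# A "de-identified" log: no names or account IDs, only a device token
# plus quasi-identifiers of the sort Apple's policy says it may collect.
deidentified_log = [
    {"device_id": "a91f", "zip": "NG7 2RD", "area_code": "0115", "tz": "GMT"},
    {"device_id": "77c3", "zip": "SW1A 1AA", "area_code": "020", "tz": "GMT"},
]

# A hypothetical public directory containing the same quasi-identifiers
# alongside real names (phone books and voter rolls play this role in practice).
public_directory = [
    {"name": "A. Smith", "zip": "NG7 2RD", "area_code": "0115", "tz": "GMT"},
    {"name": "B. Jones", "zip": "SW1A 1AA", "area_code": "020", "tz": "GMT"},
]

def link(records, directory):
    """Match each record to directory entries sharing its quasi-identifiers."""
    quasi = ("zip", "area_code", "tz")
    matches = {}
    for rec in records:
        key = tuple(rec[q] for q in quasi)
        candidates = [p["name"] for p in directory
                      if tuple(p[q] for q in quasi) == key]
        # A single candidate means the "anonymous" record is re-identified.
        if len(candidates) == 1:
            matches[rec["device_id"]] = candidates[0]
    return matches

print(link(deidentified_log, public_directory))
# {'a91f': 'A. Smith', '77c3': 'B. Jones'}
```

The defence against this is not stripping names but ensuring that every combination of quasi-identifiers is shared by many people, which is the idea behind k-anonymity.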
Based on various statements made earlier this year by Tim Cook, Apple does seem to be acutely aware of the general sensitivity about privacy in the population since Edward Snowden’s revelations about the NSA, which included documents showing Apple’s involvement in the NSA PRISM programme. Speaking to the UK Telegraph in February, for instance, Cook said that “None of us should accept that the government or a company or anybody should have access to all of our private information. This is a basic human right. We all have a right to privacy. We shouldn’t give it up. We shouldn’t give in to scare-mongering or to people who fundamentally don’t understand the details.” While the sentiment in such statements is good, a closer investigation of Apple’s policies shows that there are still many issues of concern regarding privacy. Especially with the introduction of new types of sensorized devices, such as wearables, it is necessary to make sure we get the privacy settings right from the start. Otherwise we risk a situation, a couple of years from now, where another tech CEO is tempted into statements like Zuckerberg’s infamous proclamation that ‘privacy is no longer a social norm’, simply because people were never given sufficient means to control their privacy in the way they would like.